A six-person team from Beijing University of Technology developed the app in two months using a variety of tools. The team used Maya* 3D to create the 3D models, Unity* 3D to render the 3D scenes and build the application logic, and then tied all the components together with the Unity 3D plug-in included in the Intel Perceptual Computing SDK. The demo combines 3D models and animated video to create a new way of interacting with a virtual world. The app encourages users to explore a digital unknown world by moving their bodies and using gesture, voice, and touch, which makes the team's future work something to look forward to.
About the Dinosaur
With its AR visual effects, ARPedia is essentially a game for writing and experiencing stories. As users grow accustomed to seamless interactive experiences, many technologies are being used to create them, even when the interaction is quite simple. In a PC game, a standard mouse and keyboard or a touch screen are the usual ways to interact with an application. ARPedia, however, uses none of these. In an AR application, a natural user interface is essential. ARPedia users control the action with bare-hand gestures and facial movement, thanks to the Creative Senz3D* camera. Many interesting gestures, such as grabbing, waving, tapping, lifting, and pressing, help enrich the gameplay. These gestures make the player the true controller of the game and of the virtual dinosaur world.
Figure 1: ARPedia* combines augmented reality with a wiki-based encyclopedia, letting users navigate the interface with hand gestures.
Team leader Zhongqian Su had built an educational app around a small Tyrannosaurus rex character for a previous assignment, so he cast the well-known dinosaur as the star of the ARPedia app. Using hand movements, players reach out and pick up a small dinosaur image, then place it at various points on the screen. Depending on where the dinosaur is placed, users learn about the creature's diet, habits, and other characteristics.
According to team member Liang Zhang, the team had already written an AR application for the education market using the dinosaur 3D model. Although they had an app to build on, the contest requirements called for substantial changes. For example, their earlier camera code used a different 3D technology, so they had to rewrite it (see Figure 3) to work with the newer Creative Interactive Gesture Camera Kit. That also meant getting up to speed quickly on the Intel Perceptual Computing SDK.
Developers around the world are learning about Intel® RealSense™ technology. At CES 2014, Intel announced Intel RealSense technology as the new name and brand for what was previously Intel® Perceptual Computing technology. This intuitive new user interface builds on capabilities such as gesture and voice that Intel brought to market in 2013. With Intel RealSense technology, users gain additional new capabilities, including scanning, modifying, printing, and sharing in 3D, plus major advances in AR interfaces. With these capabilities, users can naturally manipulate and play with scanned 3D objects in games and applications using advanced hand and finger sensing.
Zhang can now see first-hand how other developers are working with AR technology. At CES 2014, he saw demos from around the world. Although each demo was unique and pursued different goals, he saw in all of them the benefits of rapidly advancing 3D camera technology. "Having gesture detection included in the SDK is very helpful. People can still use the camera in different ways, but the SDK already gives them a broad foundation. I recommend that developers build their own projects with this technology and find the capabilities to fully develop their ideas."
With advanced hand and finger tracking, developers can let their users control devices through complex 3D manipulation with greater precision and simpler commands. With natural-language speech technology and accurate facial recognition, devices can better understand their users' needs.
Depth sensing enables more realistic gameplay, and accurate hand and finger tracking brings superior tracking to any virtual adventure. Games will become more lifelike and more fun. With AR and finger-sensing technology, developers will be able to merge the real world and the virtual world.
Zhang believes the upcoming Intel RealSense 3D camera will be a great fit for the scenarios he knows well. "From what I know, it will be even better: more accurate, more capable, more intuitive. We're really looking forward to it. It will also add 3D face tracking and other great features. It's the first 3D camera for laptops used as a motion-sensing device, but it's different from Kinect. It also provides the same capabilities as an integrated 3D camera. I think the new Intel camera is a better device for manufacturers to integrate into laptops and tablets. And as a tiny user-interface device, it has great portability advantages. With this camera, we will certainly build many great projects in the future."
Maya 3D
The ARPedia team used the Maya 3D modeling software to continue developing its well-known small, lifelike model: the little T. rex. Once the right model was built, complete with lifelike movement and finely detailed color, the rest of the application fell into place.
Maya is the gold standard for creating 3D computer animation, modeling, simulation, and rendering. It is a highly extensible production platform that supports next-generation display technology, speeds up modeling workflows, and handles complex data. The team had little experience with other 3D software, but they had worked with Maya and could easily update it and integrate it with their existing graphics. Zhang said the team spent extra time on graphics development: "We spent nearly a month designing and revising the graphics to polish everything and improve the interaction."
Tencent wanted to give gamers the best experience on Intel® Ultrabook™ and 2 in 1 systems. Legend of Xuan Yuan was already a successful game, but these systems provided Tencent with a new opportunity. Many systems currently provide 2 in 1 usage, meaning they can be handled as a traditional laptop or a tablet. Tencent worked with Intel engineers to detect the laptop and tablet modes to change the game’s state. They updated the UI to support touch, which has become one of the most essential and exciting features on tablets. Finally, the system’s sensors allowed new gameplay by including “shake” to enable a special action in the game.
Introducing the first touch 3D MMORPG for the Chinese market
Tencent is the biggest game developer in China. With a growing number of 2 in 1 systems in the Chinese market, Tencent wanted to give their players a unique experience. After two years in the market, Legend of Xuan Yuan was already a popular title. The availability of Ultrabooks and 2 in 1 systems made it the right time to add touch and accelerometer support to the game. Although 3D MMORPGs are very popular in China, none of them supported touch before Legend of Xuan Yuan. Tencent had a chance to innovate, but there was also risk – would the changes be successful? This case study illustrates how, working with Intel engineers, Tencent changed the game to play well on 2 in 1 systems and Ultrabooks running Windows* 8.
Legend of Xuan Yuan needs two different UIs for tablet and laptop modes. On 2 in 1 systems, the game detects when the system is used as a laptop versus a tablet. The game uses keyboard and mouse when the system is in laptop mode. When it’s used as a tablet, the game switches to a touch-only UI. Tencent wanted an effortless transition between the traditional laptop mode and touch gameplay. The player has a seamless experience because the UI changes automatically to suit each mode. In this case study, we’ll look at how to detect the mode of a 2 in 1 system and change the UI based on that mode.
Converting an existing user interface to touch can be difficult. It’s especially hard for games with rich UIs that rely on left-click, right-click, and multiple command keys. There’s no single formula for adapting this kind of UI. It requires great care to deliver smooth and satisfying gameplay via touch. Because the game had an existing installed base, the team was careful to make the smallest changes possible and not alienate existing players. We’ll review the UI design.
Since these systems include an accelerometer, Tencent also added support for a “super-kill” attack against opponents when you shake the system during gameplay.
Changing game mode to match the 2 in 1 state
Legend of Xuan Yuan has two UI personalities and dynamically changes the UI based on the state of a 2 in 1 system. When the system is in laptop mode, Legend of Xuan Yuan plays as it always has with keyboard and mouse input. When the system is in tablet mode, the player uses touch input. How does this work?
Here’s how we did it: Detecting 2 in 1 state changes and changing UI mode
Legend of Xuan Yuan listens for the WM_SETTINGCHANGE message. This message notifies apps when the system changes state. The WM_SETTINGCHANGE message comes with its LPARAM pointing to a string with a value of “ConvertibleSlateMode” when the 2 in 1 state changes. A call to GetSystemMetrics(SM_CONVERTIBLESLATEMODE) reveals the current state.
When the game is in tablet mode, it displays an overlay UI with touch buttons for the various UI actions. It hides the overlay UI in laptop mode.
Legend of Xuan Yuan uses detection code like this:
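The original listing is not reproduced here. Below is a minimal, portable sketch of the decision logic; the helper names are assumptions, and the real code would read the metric via GetSystemMetrics(SM_CONVERTIBLESLATEMODE) inside the window procedure (the metric is 0 in slate/tablet mode, nonzero in laptop mode).

```cpp
#include <cassert>
#include <cstring>

enum class UiMode { Laptop, Tablet };

// GetSystemMetrics(SM_CONVERTIBLESLATEMODE) returns 0 in slate (tablet) mode
// and nonzero in laptop mode; this hypothetical helper maps that value.
UiMode UiModeFromMetric(int convertibleSlateMetric) {
    return convertibleSlateMetric == 0 ? UiMode::Tablet : UiMode::Laptop;
}

// WM_SETTINGCHANGE delivers the changed setting's name via LPARAM; a 2 in 1
// state change is identified by the string "ConvertibleSlateMode".
bool IsConvertibleSlateChange(const char* settingName) {
    return settingName != nullptr &&
           std::strcmp(settingName, "ConvertibleSlateMode") == 0;
}
```

In the real message handler, when IsConvertibleSlateChange fires, the game queries the metric and shows or hides the touch overlay accordingly.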
This technique must be enabled by the system OEM, with a supporting driver installed, in order to work. In case it’s not properly enabled on some systems, we included a manual way to change the UI configuration. A later section describes how the UI works.
Now it’s your turn: Detect if the system is used as a laptop or tablet
How can you detect the system state for your game and how should it change its UI? To play best on 2 in 1 systems, your game should have dual personalities and dynamically change its UI based on the state of the system.
First plan a UI for both laptop and tablet modes. Consider how the system might be held or placed. Pick the UI interactions that work best for your players. As you design the touch interface, you should reserve more screen area for your touch controls than you typically need for mouse buttons. Otherwise, players will struggle to reliably press the touch controls.
Touch interactions are often slower than keyboard and mouse, so keep that in mind too. Game menus also need both a keyboard plus mouse UI and a touch UI.
It’s a good idea to check the system state at startup with GetSystemMetrics and set your UI accordingly. Remember that not all systems will correctly report their state or notify your game of state changes, so choose a default startup state for your game’s UI in case the state isn’t detected.
Listen for the WM_SETTINGCHANGE message once the game is running. When the message arrives, check its contents for an LPARAM pointing to a string with a value of “ConvertibleSlateMode”. That value indicates that the game should call GetSystemMetrics(SM_CONVERTIBLESLATEMODE) and check if the UI should change.
Detection may not be conclusive because all systems may not correctly report state changes. Your game should probably default to laptop mode if the detection doesn’t give certain results. It should definitely include a way to manually change the UI between keyboard/mouse mode and tablet mode.
For a complete sample that detects system state and changes its UI, look at the 2 in 1 aware sample. To see a more complex sample that detects docking, screen orientation, and more, check the detecting state sample.
Deciding on a touch message type
You'll need to decide which Windows message type to support before you can add touch support to an existing application. Choose one of three sets of Windows messages: WM_POINTER, WM_GESTURE, or WM_TOUCH. We'll walk through the decision process used for Legend of Xuan Yuan and examine ways you can do the same for your game.
How we did it: Comparing touch message types
Touch support is at the center of the new version of Legend of Xuan Yuan. When players use the touch screen, they see a new UI with a set of touch controls on screen.
WM_POINTER is the easiest message type to code, and it supports a rich set of gestures. WM_POINTER only runs on Windows 8 and beyond. Tencent wanted to support a large installed base of Windows 7 players, so WM_POINTER was not the right choice.
Before we discuss the remaining message types, let’s review the key UI elements for Legend of Xuan Yuan. The game’s touch UI uses on-screen button controls. These controls can be used for movement and actions at the same time. The movement and action controls are on opposite sides of the screen, for use with two hands. These controls are in the bottom corners of the screen. There’s also an icon near the top of the screen to bring up a cascading menu for more complex UI elements. We’ll discuss the design of the UI later, but this gives us a good idea how the UI elements must work.
Figure 2: On-screen touch overlay UI, in left and right bottom corners
The game must recognize simultaneous points of contact from different parts of the screen. Because multiple touches must work at the same time, we refer to this as multi-touch.
Now that we understand the main parts of the multi-touch UI, we can compare the remaining touch message types: WM_GESTURE and WM_TOUCH. The easiest one to code is WM_GESTURE, which has simple support for typical gestures like a two-finger pinch (zoom) and finger swipe (pan). This message type hides some of the detail of touch interaction and presents your code with a complete gesture once the gesture is done. Simple touch events are still sent to your game as mouse messages. This means a typical touch interface could be implemented using mouse messages for simple touch events plus WM_GESTURE for complex gestures.
The gestures supported by WM_GESTURE can only include one set of related touch points. This makes it difficult to support gestures from this kind of multi-touch UI where the player touches the screen in different places. WM_GESTURE is a poor choice for this game.
WM_TOUCH is the lowest-level touch message type. It gives complete access to all touch events (e.g., “finger down”). WM_TOUCH requires you to do more work than the other message types since you must write code to represent all high-level touch events and gestures out of low-level touch messages. In spite of the extra work required, WM_TOUCH was the clear choice for Legend of Xuan Yuan. WM_TOUCH gave complete control over all touch interaction including multi-touch.
When there’s a physical touch on the screen, the system sends WM_TOUCH messages to the game. The game also receives a mouse click message at the same time. This makes it possible for apps without full touch support to behave properly with touch. Because these two messages of different types describe the same physical event, this can complicate the message handling code. Legend of Xuan Yuan uses mouse-click messages where possible and discards duplicate messages.
Your turn: Choosing the right touch message type for your game
WM_POINTER is a great option if your game will only be used on Windows 8. If you need backward compatibility, look at both WM_GESTURE and WM_TOUCH messages.
Consider your UI design as you compare the message types. If your UI relies heavily on gestures, and you can easily write mouse-click handlers for the non-gesture single touch events, then WM_GESTURE is probably right for your game. Otherwise, use WM_TOUCH. Most games with a full-featured UI use WM_TOUCH, especially when they have multiple controls that players will touch at the same time.
When you evaluate the touch messages, don’t forget the menu system. Remember also to discard extra messages that arrive as mouse clicks.
Adapting an existing game UI to touch can be complex, and there’s no single formula for how to do it well.
How we did it: A new touch UI
The keyboard and mouse UI is familiar. It uses the W, A, S, D keys to move the character. Customizable action keys on the bottom of the screen and shortcut keys 1-9 hold potions, attack skills, and open richer UI elements. These UI elements include inventory, skill tree, task, and map. Right-click selects the character’s weapon and armor or opens a treasure box.
The touch screen is available at all times, but the touch UI is hidden by default during keyboard and mouse gameplay. A touch button is visible on-screen in this mode.
Figure 3: Pressing this touch button in this mode brings up the touch UI
If the player switches the system to tablet mode or touches this button, the full touch UI appears on-screen.
How we did it: Elements of the touch UI
In tablet mode, the player usually holds the system with both hands. The UI layout uses both thumbs to minimize any grip changes. Move and attack actions are grouped for easy access by the player’s left and right thumbs.
First, we designed a wheel control to move the character. The wheel is an overlay on the left side of the screen. This is similar to game controllers, and this familiar use and placement makes it easy to use. The player’s left thumb will usually be in constant contact with the screen. As they slide their thumb around, the character moves on-screen in the direction of the player’s thumb.
The regular in-game action bar is at the bottom of the screen, but that doesn’t work well for thumb use. We added a group of 4 large action buttons in the bottom right corner where the player’s right thumb can easily reach them. The player can configure these to trigger their most frequently-used actions by dragging attack skills or potions to each button.
The player must target an enemy before attacking. With the keyboard/mouse interface, left-click targets a single enemy and TAB targets the next enemy within attack range. In touch mode there’s a large button to target the next close enemy. The player can also touch an enemy to target them directly, but that’s not common since it disrupts the player’s grip on the tablet.
The keyboard and mouse UI uses right-click to open a treasure box, equip a weapon or armor, or drink potions. Tap and hold is the best touch replacement for right-click, so it replaces right-click for the touch UI.
With the keyboard/mouse UI, there’s a small icon on-screen to open the cascaded windows. This doesn’t work well for touch since the icons are too small. The touch UI includes an icon on-screen to bring up the rest of the UI elements through a cascading set of icons. These icons bring up more complex parts of the UI like the inventory bag, skill tree, tasks, etc. There is also an option to toggle the UI between the keyboard/mouse and the touch overlay. This gives the player an easy way to change between the two UIs.
Figure 4: Full touch UI with movement wheel, action and target buttons, and the cascading UI displayed
Here’s the full touch UI with the cascading icons open.
How we did it: Message handling for the touch UI
How does the message handling work? It varies for different parts of the UI. Both WM_TOUCH and mouse messages are used. The action, targeting, and cascading UI buttons all use mouse click messages. The movement wheel and main part of the game screen use WM_TOUCH messages.
Typical gameplay involves continuous touching on the movement wheel control, with repeated use of the enemy selection and skill attack buttons. This means that good multi-touch support is essential. Luckily, WM_TOUCH has good support for multi-touch.
When there’s a WM_TOUCH message, the game saves some context. It compares this touch with other recent WM_TOUCH messages, checks how long the current sequence of touches has been held, and looks for the location of the touch.
If the WM_TOUCH message was on or near the movement wheel, the code checks the location of the touch relative to the center of the wheel and the previous touch. If the touch was close to a previous touch and this current gesture started on the wheel, the game moves the character in the desired direction. During development, this required some careful configuration to detect the difference between long continuous touches on the movement wheel and other touches on the main part of the screen.
If a WM_TOUCH message is on the screen away from the other controls, then it might be part of a gesture like zoom or pan, or it may be part of a tap-and-hold. WM_TOUCH messages are compared with previous ones to decide which action to take. If it’s close enough to the first and has been held for longer than 0.2 seconds, it’s treated as a tap-and-hold. Otherwise, it’s a gesture so the screen is adjusted to match.
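That classification can be sketched as follows; only the 0.2-second hold threshold comes from the text, while the "close enough" radius and all names are illustrative assumptions:

```cpp
#include <cassert>
#include <cmath>

enum class TouchAction { TapAndHold, Gesture };

constexpr double kHoldSeconds = 0.2;   // from the description above
constexpr double kSlopPixels  = 16.0;  // assumed "close enough" radius

// Classify a touch by its distance from the first touch in the sequence and
// by how long the sequence has been held.
TouchAction ClassifyTouch(double dxFromFirst, double dyFromFirst,
                          double heldSeconds) {
    const double dist = std::hypot(dxFromFirst, dyFromFirst);
    return (dist <= kSlopPixels && heldSeconds > kHoldSeconds)
               ? TouchAction::TapAndHold
               : TouchAction::Gesture;
}
```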
The system also automatically generates mouse messages for all touch messages. Each mouse message includes extra information detailing where it came from. The GetMessageExtraInfo call identifies the difference.
#define MOUSEEVENTF_FROMTOUCH 0xFF515700

if ((GetMessageExtraInfo() & MOUSEEVENTF_FROMTOUCH) == MOUSEEVENTF_FROMTOUCH) {
    // Click was generated by wisptis / Windows Touch
} else {
    // Click was generated by the mouse.
}
Figure 5: Check if mouse messages came from touch screen
When a mouse message was generated from the touch screen and the game has already handled the physical touch via WM_TOUCH, the game discards the mouse message.
If a touch message is on one of the other controls, then it is discarded and the mouse message is used instead.
With all UI elements in place, the game plays well with a touch screen.
Before you build a touch UI for your game, imagine all the actions a player might take. Then think about how they might be done with touch (or other sensors like the accelerometer). Pay special attention to the differences between tap and click, continuous actions like press-and-hold, and gestures like drag.
Decide how the player will do all of these actions with touch and where the visible controls should be. For any on-screen controls or cascading menus, ensure they are big enough to use with a fingertip or thumb. Think about how your typical player will hold the system, and design your UI for easy touch access with a typical grip.
Now that you have the UI planned, use the simplest message for the job. Identify when a touch hits each control. Plan which message type to use for those controls (mouse or touch) and discard duplicate messages.
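A minimal hit-testing sketch for that routing, assuming an illustrative 1366x768 layout with the wheel bottom-left and the action buttons bottom-right (all coordinates and names are hypothetical):

```cpp
#include <cassert>

struct Rect { int left, top, right, bottom; };

bool Contains(const Rect& r, int x, int y) {
    return x >= r.left && x < r.right && y >= r.top && y < r.bottom;
}

enum class Control { Wheel, ActionButtons, Screen };

// Map a touch point to the control it lands on so the right message path
// (mouse click vs. WM_TOUCH) can handle it.
Control HitTest(int x, int y) {
    const Rect wheel   {0,    518, 250,  768};
    const Rect actions {1116, 518, 1366, 768};
    if (Contains(wheel, x, y))   return Control::Wheel;
    if (Contains(actions, x, y)) return Control::ActionButtons;
    return Control::Screen;
}
```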
For touch messages, save the context of the touch message. Location, control, and timing will all be useful when you need to compose gestures out of multiple touch messages. Think about the parts of your gameplay that require continuous touch contact. Carefully test this during development to make sure that your game works well with typical variations in gestures. Check a variety of gesture directions, touch locations, proximity to previous touches, and touch durations.
Start the UI in whatever mode matches the system’s current state. Switch the UI between touch and keyboard/mouse whenever the system state changes. Finally, remember to include a manual way to force the UI change in case the system isn’t configured to notify you properly.
Ultrabook and 2 in 1 systems include sensors like gyroscope, accelerometer, GPS, etc. It’s possible to enhance the gameplay experience with them.
How we did it: Shake for a special action
Legend of Xuan Yuan uses the accelerometer to detect when the player shakes the system. The player accumulates energy during gameplay, then releases it during a super kill attack. The player can shake the system to trigger the super kill, which attacks nearby enemies for 10-20 seconds.
We tested some different shake actions to measure typical values from the accelerometer:
Figure 6: Four shake actions, showing intensity and duration in 3 dimensions
Any acceleration over 1.6 on one axis is a shake. We could also use the sum of the absolute values of acceleration on each axis.
Because these are real-world events, the data will be noisy and different each time. The values include both long and short shakes. While most of our test shakes gave a single peak value, one of them had several near-peak values. This game uses any shake over 1.6 in any direction on any axis. Multiple shakes within 1.5 seconds are grouped together as one.
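The shake rule described above can be sketched as follows; the function names are assumptions, and real accelerometer samples would arrive from the sensor API rather than as plain doubles:

```cpp
#include <cassert>
#include <cmath>

constexpr double kShakeThreshold     = 1.6;  // per-axis threshold from above
constexpr double kGroupWindowSeconds = 1.5;  // shakes this close merge as one

bool IsShake(double ax, double ay, double az) {
    return std::fabs(ax) > kShakeThreshold ||
           std::fabs(ay) > kShakeThreshold ||
           std::fabs(az) > kShakeThreshold;
}

// Returns true when a shake should trigger the super kill, i.e. it is not
// grouped with a previous shake inside the grouping window.
bool ShouldTrigger(double nowSeconds, double* lastShakeSeconds,
                   double ax, double ay, double az) {
    if (!IsShake(ax, ay, az)) return false;
    if (nowSeconds - *lastShakeSeconds < kGroupWindowSeconds) return false;
    *lastShakeSeconds = nowSeconds;
    return true;
}
```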
With this code in the game, any shake action will unleash a super kill action.
Your turn: Use the system’s sensors
Ultrabook and 2 in 1 systems contain a number of sensors. Be creative, and think of ways you might use each of them to enhance your gameplay.
Whichever sensor(s) you use, calibrate them to see how they react in real-world conditions. Consider the different typical conditions your players will encounter.
Summary
We’ve shown how to adapt an existing game to detect the state (laptop or tablet) of a 2 in 1 system. We also demonstrated how the UI can support touch and how to switch between UIs based on the 2 in 1 system state. Along with the accelerometer to trigger a unique action in the game, these give a compelling game experience.
Tencent took a risk by introducing the first Chinese MMORPG to support touch gameplay. The risk has paid off! Legend of Xuan Yuan plays great on laptops, tablets, and 2 in 1 systems. We hope you have similar success with your game!
Authors
Mack Han is a game client software engineer for Tencent with 10 years of game development experience. He has built games for console, PC, and mobile. He has been working with a big 3D MMORPG for years, specializing in rendering and optimizing.
Cage Lu is an Application Engineer at Intel. He has been working with big gaming ISVs in China for several years to help them optimize game client performance and user experience on Intel® platforms.
Paul Lindberg is a Senior Software Engineer in Developer Relations at Intel. He helps game developers all over the world to ship kick-ass games and other apps that shine on Intel platforms.
Intel sample source is provided under the Intel Sample Source License Agreement. Portions of this document are subject to the Microsoft Limited Public License.
XNA Game Studio provides everything you need to create XNA games for Windows and Xbox 360. It also includes the content compiler, which compiles assets into the .xnb files a MonoGame project needs. Currently, the compiler can only be installed with Visual Studio 2010. If you don't want to install Visual Studio 2010 just for that, you can install XNA Game Studio in Visual Studio 2012 (see the link in the "Learn more" section of this article).
Windows Phone 8 SDK
You can install XNA Game Studio directly in Visual Studio 2012, but it's better to install the Windows Phone 8 SDK in Visual Studio 2012. You can use it to create a project that compiles the assets.
protected override void LoadContent()
{
// Create a new SpriteBatch, which can be used to draw textures.
_spriteBatch = new SpriteBatch(GraphicsDevice);
// TODO: use this.Content to load your game content here
_backgroundTexture = Content.Load<Texture2D>("SoccerField");
_ballTexture = Content.Load<Texture2D>("SoccerBall");
}
Note that the texture names are the same as the file names in the Content folder, but without the extension.
Next, draw the textures in the Draw method:
protected override void Draw(GameTime gameTime)
{
GraphicsDevice.Clear(Color.Green);
// Set the position for the background
var screenWidth = Window.ClientBounds.Width;
var screenHeight = Window.ClientBounds.Height;
var rectangle = new Rectangle(0, 0, screenWidth, screenHeight);
// Begin a sprite batch
_spriteBatch.Begin();
// Draw the background
_spriteBatch.Draw(_backgroundTexture, rectangle, Color.White);
// Draw the ball
var initialBallPositionX = screenWidth / 2;
var initialBallPositionY = (int)(screenHeight * 0.8);
var ballDimension = (screenWidth > screenHeight) ?
(int)(screenWidth * 0.02) :
(int)(screenHeight * 0.035);
var ballRectangle = new Rectangle(initialBallPositionX, initialBallPositionY,
ballDimension, ballDimension);
_spriteBatch.Draw(_ballTexture, ballRectangle, Color.White);
// End the sprite batch
_spriteBatch.End();
base.Draw(gameTime);
}
if (TouchPanel.IsGestureAvailable)
{
// Read the next gesture
GestureSample gesture = TouchPanel.ReadGesture();
if (gesture.GestureType == GestureType.Flick)
{…
}
}
if (gesture.GestureType == GestureType.Flick)
{
_isBallMoving = true;
_isBallHit = false;
_startMovement = gameTime.TotalGameTime;
_ballVelocity = gesture.Delta*(float) TargetElapsedTime.TotalSeconds/5.0f;
}
...
var timeInMovement = (gameTime.TotalGameTime - _startMovement).TotalSeconds;
// reached goal line or timeout
if (_ballPosition.Y < _goalLinePosition || timeInMovement > 5.0)
{
_ballPosition = new Vector2(_initialBallPosition.X, _initialBallPosition.Y);
_isBallMoving = false;
_isBallHit = false;
while (TouchPanel.IsGestureAvailable)
TouchPanel.ReadGesture();
}
Adding the goalkeeper
The game works now, but it needs an element of difficulty: you must add a goalkeeper that keeps moving after the user kicks the ball. The goalkeeper is a .png file compiled by the XNA content compiler (Figure 6). We must add the compiled file to the Content folder, set its Build Action to Content, and set "Copy to Output Directory" to "Copy if Newer".
Figure 6. The goalkeeper
The goalkeeper is loaded in the LoadContent method:
protected override void LoadContent()
{
// Create a new SpriteBatch, which can be used to draw textures.
_spriteBatch = new SpriteBatch(GraphicsDevice);
// TODO: use this.Content to load your game content here
_backgroundTexture = Content.Load<Texture2D>("SoccerField");
_ballTexture = Content.Load<Texture2D>("SoccerBall");
_goalkeeperTexture = Content.Load<Texture2D>("Goalkeeper");
}
Then we must draw it in the Draw method:
protected override void Draw(GameTime gameTime)
{
GraphicsDevice.Clear(Color.Green);
// Begin a sprite batch
_spriteBatch.Begin();
// Draw the background
_spriteBatch.Draw(_backgroundTexture, _backgroundRectangle, Color.White);
// Draw the ball
_spriteBatch.Draw(_ballTexture, _ballRectangle, Color.White);
// Draw the goalkeeper
_spriteBatch.Draw(_goalkeeperTexture, _goalkeeperRectangle, Color.White);
// End the sprite batch
_spriteBatch.End();
base.Draw(gameTime);
}
For a desktop app, we could use the Windows API Code Pack for Microsoft .NET Framework, a library that provides access to features of Windows 7 and later operating systems. For this game, however, we took a different route: the WinRT sensor APIs. Although these APIs were written for Windows 8, they also work in desktop apps and can be used without modification. With them, you can port the app to Windows 8 without changing any code.
Intel® Developer Zone (IDZ) includes an article on how to use WinRT APIs in desktop apps (see the "Learn more" section). Based on that information, you must select the project in Solution Explorer, right-click it, and click Unload Project. Then right-click the project again and click Edit project. Add the TargetPlatformVersion tag in the first PropertyGroup:
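As a sketch, the edit looks like this (the surrounding PropertyGroup contents vary by project; 8.0 targets the Windows 8 SDK):

```xml
<PropertyGroup>
  <TargetPlatformVersion>8.0</TargetPlatformVersion>
</PropertyGroup>
```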
This code sample uses the same OpenCL kernel as the ToneMapping sample (see References below), which was released with the Intel® SDK for OpenCL Applications [2]. This simple kernel is designed to make pictures that are too dark or too bright to make out clearly visible. It reads pixels from an input buffer, modifies them, and writes them to the same location in an output buffer. For more information on how this kernel works, see the document High Dynamic Range Tone Mapping Post Processing Effect [3].
OpenCL Implementation
The SampleOCL sample application is not intended to "wrap" OpenCL, that is, to replace the OpenCL API with a "higher-level" API. In general, I have found that such wrappers are no simpler or cleaner than using the OpenCL API directly, and while the programmer who created a wrapper may find it easier to use, it places a significant burden on the OpenCL programmers who maintain the code. The OpenCL API is a standard. Wrapping it in a specialized "improved" API throws away much of the value of using a standard.
That said, the SampleOCL implementation does use some C++ classes and associated methods to group the OpenCL API calls. The application is organized around two main classes that separate the generic application elements from the OpenCL-related elements: C_SampleOCLApp for the former and C_OCL for the latter.
async void initiator_InviteeResponded(STCSession session, InviteResponse response)
{
await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
{
switch(response)
{
case InviteResponse.ACCEPTED:
// You are connected to the user.
break;
case InviteResponse.REJECTED:
//Invite was rejected by the remote user.
break;
case InviteResponse.TIMEDOUT:
//No response. Invite time-out.
break;
}
});
}
Brown won the 2012 Intel App Innovation Contest with his BlinkTalk app. For him, SensiGator posed brand-new programming challenges, including how to use the Windows Runtime (WinRT) APIs in a Windows 8 desktop app, how to simulate keyboard and mouse actions on a touch screen, and how to navigate a map using only the onboard sensors (especially the inclinometer and gyrometer) while delivering a smooth user experience.
Another challenge was detecting whether the tablet was in landscape or portrait orientation and handling map navigation accordingly. SensiGator is designed to be operated with the tablet held face-up in landscape, but it also works in portrait. When the tablet's orientation changes, however, the app needs to redefine how it uses the tablet's inclinometer data for map navigation. The code snippet in Snippet 4 shows how Brown's app first detects the tablet's orientation and then redefines the X and Y axes.
Brown did all of his code editing on the tablet, which made it easier to test each feature and which shows that the conventional wisdom that "tablets are great for consumption but not for creating content" isn't entirely true. Not having to download the app to a tablet for testing, he said, probably helped him finish the app in six weeks. The experience wasn't without drawbacks: the build process, for example, took longer than on a development-class machine.
Brown said that if he were building another app without too much code, he would try developing on a tablet again, although he would also consider an integrated development environment that supports working away from the tablet. The key, he said, is finding a good Windows tablet emulator that includes all the sensors and runs on the desktop, like the emulators used for building Android* and iOS* tablet and phone apps.
Brown, a professional with many years of experience, admits: "When it comes to Visual Studio, I'm not sure where to find the tools at this time, or whether a proper emulator environment exists." That remark itself hints at the wide-open possibilities for development on Windows 8 tablets and touch screens.
Followers of the 2013 App Innovation Contest might think Brown came out of nowhere among hobbyist coders, and that SensiGator's success owed something to luck. Brown's success, however, was no flash in the pan. His earlier BlinkTalk app gave patients with locked-in syndrome, who are paralyzed in all voluntary muscles below the eyes, a way to communicate. Brown also won second prize in Phase 1 of the 2013 Intel Perceptual Computing Challenge with his NuiLogix project, a perceptual computing showcase for networked device control.
Another of his projects, the PerC Robotic Arm Controller, earned a Pioneer Award in Phase 2 of the Intel Perceptual Computing Challenge. (Brown's video shows the robotic arm responding to voice commands to pick up chess pieces, move them, and drop them into a glass; robots that mix a fine cocktail or babysit the kids can't be far off.) Impressed by Brown and his apps, Intel gave him an exhibitor pass to Mobile World Congress 2014 to demonstrate SensiGator at the Intel booth.
"My advice is to never lock yourself into one particular language, platform, or tool, and never try to solve every problem with the same skill set. The most interesting part of this contest was the requirement to build a Windows desktop app for use on a Windows 8 tablet. That may have caught developers off guard who leaned toward a native Windows 8 development approach, but the fun lies in going beyond your comfort zone, and it ultimately pays off."
Resources
Intel helps developers like Brown solve such problems, taking full advantage of the latest Windows 8 and Intel technologies to build flexible, innovative apps that span many devices. Intel Developer Zone offers world-class knowledge, tools, and Windows 8 application support for cross-platform and desktop development, and Brown drew on it heavily during his work. Intel Developer Zone is a vital resource for developers who, like Brown, want to create 2 in 1 apps that support multiple input modes: touch/stylus as well as keyboard/mouse.
Developing a native Intel® MIC application is as easy as developing an application for an IA-32 or Intel® 64 host. In most cases, you simply cross-compile it (/Qmic). However, the Intel® MIC architecture differs from the host architecture, and those differences can expose latent problems. Moreover, debugging on Intel® MIC can raise new questions of its own (data alignment; can an application handle many hundreds of threads? Is memory consumed efficiently? And so on.)
To address these issues, Intel has introduced a new software development kit, the Intel® Stereo 3D SDK, designed specifically for stereoscopic 3D games. The Intel Stereo 3D SDK targets Windows* stereoscopic 3D application development and exposes API functions to game developers. The SDK lets developers simply pass their original camera parameters as inputs to the API functions, which return suitable stereo camera parameters without requiring any additional input. Thanks to this convenience, developers can achieve good stereoscopic 3D effects without any special stereo 3D expertise and can focus on designing the game itself.
The SDK currently supports the side-by-side and top-bottom 3D formats. It has the following four main features, as shown in Figure 1:
Figure 1. Main features of the Intel® Stereo 3D SDK
D3D 12 delivers greater CPU efficiency through leaner pipeline state objects (PSOs). Instead of setting and reading each state individually, developers now set state at a single point, reducing or eliminating hardware-mismatch overhead. The app sets the PSO, and the driver processes the API commands and translates them into GPU code. The new resource binding model removes the confusion previously caused by the required control-flow logic.
With heaps, tables, and bundles, D3D 12 improves significantly in both CPU efficiency and scalability. Linear binding points are replaced by memory objects managed by the app/game. Frequently used commands can be recorded in bundles and replayed multiple times within a frame or across frames. Command lists and command queues support creating command lists in parallel across multiple CPU threads. Most of the work can now be divided evenly among all CPU threads, unlocking the full potential and performance of 4th and 5th generation Intel® Core™ processors.
Direct3D 12 is a leap forward for PC gaming technology. With a simpler API and a thinner driver, game developers can get closer to the metal. This improves efficiency and performance. Through close collaboration, the D3D development team created a new API and driver model that puts control in developers' hands, letting them build games closer to their vision with great graphics and performance.