Microsoft HoloLens Academy Tutorial - Holograms 211 - Gestures

Users interact with holograms on HoloLens through gestures. The air tap gesture is similar to a mouse click on a PC: once you gaze at a hologram, an air tap triggers the corresponding feedback. This tutorial covers how to track the user's hands and respond to gesture input.

In this tutorial you will implement the following new features:

1. Detect when a hand is being tracked and provide feedback.

2. Use the Navigation gesture to rotate holograms.

3. Provide feedback when the user's hand moves out of the trackable area.

4. Use Manipulation events to move holograms.

Prerequisites:

  • A Windows 10 PC with the development tools installed.
  • Basic C# programming ability.
  • Completion of the Holograms 101 tutorial.
  • Completion of the Holograms 210 tutorial.
  • A HoloLens device configured for development.

Project files

  • Download the files required by this project.

Errata and notes

"Just My Code" needs to be disabled in Visual Studio; you can find it under Tools > Options > Debugging.

Unity setup

  • Start Unity.
  • Select Open.
  • Navigate to the Gesture folder inside the project files you just downloaded.
  • Select the Model Explorer folder inside it.
  • Click the Select Folder button.
  • Once the project has loaded in Unity, expand the Scenes folder in the Project panel.
  • Double-click the ModelExplorer scene to load it in Unity.
  • You can now build and deploy to your device for the first time; see the previous two tutorials for the build and deploy steps.

Chapter 1 - Hand detected feedback

Objectives:

1. Subscribe to hand tracking events.

2. Use cursor feedback to show the user when a hand is being tracked.

Instructions:

  • In the Hierarchy panel, select the Managers object.
  • In the Inspector panel, click the Add Component button.
  • In the search box, type Hands Manager. Select the search result.

The HandsManager.cs script performs these steps:

  1. Registers for the SourceDetected and SourceLost events.
  2. Sets the HandDetected state.
  3. Unregisters from the SourceDetected and SourceLost events.
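The three steps above can be sketched like this. This is a simplified, hypothetical `HandsManagerSketch`, not the shipped script; the project's actual HandsManager.cs also derives from HoloToolkit's `Singleton<T>` and gains extra responsibilities in Chapter 2, where it is listed in full.

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

/// <summary>
/// Minimal sketch of the chapter-1 behavior: track whether a hand
/// is currently detected by the HoloLens.
/// </summary>
public class HandsManagerSketch : MonoBehaviour
{
    // True while at least one hand source is being tracked.
    public bool HandDetected { get; private set; }

    void Awake()
    {
        // 1. Register for the SourceDetected and SourceLost events.
        InteractionManager.SourceDetected += OnSourceDetected;
        InteractionManager.SourceLost += OnSourceLost;
    }

    // 2. Update the HandDetected state.
    private void OnSourceDetected(InteractionSourceState hand) { HandDetected = true; }
    private void OnSourceLost(InteractionSourceState hand) { HandDetected = false; }

    void OnDestroy()
    {
        // 3. Unregister from the SourceDetected and SourceLost events.
        InteractionManager.SourceDetected -= OnSourceDetected;
        InteractionManager.SourceLost -= OnSourceLost;
    }
}
```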
  • In the Hierarchy panel, select the Cursor object.
  • In the Inspector panel, click the Add Component button.
  • In the search box, type Cursor Feedback. Select the search result.
  • In the Project panel, open the Holograms folder and find the HandDetectedFeedback asset.
  • Drag and drop the HandDetectedFeedback asset onto the Hand Detected Asset property of the Cursor Feedback (Script) component.
  • In the Hierarchy panel, expand the Cursor object.
  • Drag and drop the CursorBillboard object onto the Feedback Parent property of the Cursor Feedback (Script) component.

The CursorFeedback.cs script implements the following behavior:

  1. Instantiates the HandDetectedFeedback asset.
  2. Parents the HandDetectedFeedback asset to the cursor billboard object.
  3. Activates or deactivates the HandDetectedFeedback asset based on the hand detected state.
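A minimal sketch of that behavior might look like the following. The field names mirror the Inspector properties mentioned above, but this is an illustrative reconstruction, and the shipped CursorFeedback.cs may differ in detail.

```csharp
using UnityEngine;

/// <summary>
/// Simplified sketch of the CursorFeedback behavior: show a feedback
/// asset on the cursor billboard while a hand is detected.
/// </summary>
public class CursorFeedbackSketch : MonoBehaviour
{
    [Tooltip("Prefab shown while a hand is detected.")]
    public GameObject HandDetectedAsset;

    [Tooltip("Parent transform for the feedback asset (the cursor billboard).")]
    public Transform FeedbackParent;

    private GameObject handDetectedGameObject;

    void Awake()
    {
        // 1. Instantiate the HandDetectedFeedback asset and
        // 2. parent it under the cursor billboard object.
        handDetectedGameObject = (GameObject)Instantiate(HandDetectedAsset);
        handDetectedGameObject.transform.SetParent(FeedbackParent, false);
        handDetectedGameObject.SetActive(false);
    }

    void Update()
    {
        // 3. Activate or deactivate the asset based on the hand detected state.
        bool handDetected = HandsManager.Instance.HandDetected;
        if (handDetectedGameObject.activeSelf != handDetected)
        {
            handDetectedGameObject.SetActive(handDetected);
        }
    }
}
```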

Build and Deploy

  • After the project has been deployed to the HoloLens, use the air tap gesture to close the fitbox.
  • Move your head to point the gaze cursor at a hologram, and raise your index finger straight up in front of you so your hand can be tracked.
  • Move your hand left, right, up and down.
  • Watch how the cursor changes when your hand is detected and when it is not.

Chapter 2 - Navigation

Use the Navigation gesture (pinch the index finger and thumb together, then move the hand) to rotate holograms.

Instructions:

To use Navigation gesture events, you will edit four scripts:

  1. HandsManager.cs keeps track of the hologram being targeted.
  2. GestureManager.cs manages the Navigation gesture events.
  3. GestureAction.cs performs the rotation when a Navigation gesture occurs.
  4. CursorStates.cs changes the cursor's appearance to give feedback for the Navigation gesture.

  • Open the HandsManager.cs script in Visual Studio, update it to match the code below, and save it.

using HoloToolkit;
using UnityEngine.VR.WSA.Input;
using UnityEngine;

/// <summary>
/// HandsManager keeps track of when a hand is detected.
/// </summary>
public class HandsManager : Singleton<HandsManager>
{
    [Tooltip("Audio clip to play when Finger Pressed.")]
    public AudioClip FingerPressedSound;
    private AudioSource audioSource;

    /// <summary>
    /// Tracks the hand detected state.
    /// </summary>
    public bool HandDetected
    {
        get;
        private set;
    }

    // Keeps track of the GameObject that the hand is interacting with.
    public GameObject FocusedGameObject { get; private set; }

    void Awake()
    {
        EnableAudioHapticFeedback();

        InteractionManager.SourceDetected += InteractionManager_SourceDetected;
        InteractionManager.SourceLost += InteractionManager_SourceLost;

        /* TODO: DEVELOPER CODE ALONG 2.a */

        // 2.a: Register for SourceManager.SourcePressed event.
        InteractionManager.SourcePressed += InteractionManager_SourcePressed;

        // 2.a: Register for SourceManager.SourceReleased event.
        InteractionManager.SourceReleased += InteractionManager_SourceReleased;

        // 2.a: Initialize FocusedGameObject as null.
        FocusedGameObject = null;
    }

    private void EnableAudioHapticFeedback()
    {
        // If this hologram has an audio clip, add an AudioSource with this clip.
        if (FingerPressedSound != null)
        {
            audioSource = GetComponent<AudioSource>();
            if (audioSource == null)
            {
                audioSource = gameObject.AddComponent<AudioSource>();
            }

            audioSource.clip = FingerPressedSound;
            audioSource.playOnAwake = false;
            audioSource.spatialBlend = 1;
            audioSource.dopplerLevel = 0;
        }
    }

    private void InteractionManager_SourceDetected(InteractionSourceState hand)
    {
        HandDetected = true;
    }

    private void InteractionManager_SourceLost(InteractionSourceState hand)
    {
        HandDetected = false;

        // 2.a: Reset FocusedGameObject.
        ResetFocusedGameObject();
    }

    private void InteractionManager_SourcePressed(InteractionSourceState hand)
    {
        if (InteractibleManager.Instance.FocusedGameObject != null)
        {
            // Play a select sound if we have an audio source and are not targeting an asset with a select sound.
            if (audioSource != null && !audioSource.isPlaying &&
                (InteractibleManager.Instance.FocusedGameObject.GetComponent<Interactible>() != null &&
                InteractibleManager.Instance.FocusedGameObject.GetComponent<Interactible>().TargetFeedbackSound == null))
            {
                audioSource.Play();
            }

            // 2.a: Cache InteractibleManager's FocusedGameObject in FocusedGameObject.
            FocusedGameObject = InteractibleManager.Instance.FocusedGameObject;
        }
    }

    private void InteractionManager_SourceReleased(InteractionSourceState hand)
    {
        // 2.a: Reset FocusedGameObject.
        ResetFocusedGameObject();
    }

    private void ResetFocusedGameObject()
    {
        // 2.a: Set FocusedGameObject to be null.
        FocusedGameObject = null;

        // 2.a: On GestureManager call ResetGestureRecognizers
        // to complete any currently active gestures.
        GestureManager.Instance.ResetGestureRecognizers();
    }

    void OnDestroy()
    {
        InteractionManager.SourceDetected -= InteractionManager_SourceDetected;
        InteractionManager.SourceLost -= InteractionManager_SourceLost;

        // 2.a: Unregister the SourceManager.SourceReleased event.
        InteractionManager.SourceReleased -= InteractionManager_SourceReleased;

        // 2.a: Unregister for SourceManager.SourcePressed event.
        InteractionManager.SourcePressed -= InteractionManager_SourcePressed;
    }
}


  • In the Hierarchy panel, click Cursor.
  • In the Project panel, open the HoloToolkit\Input\Prefabs folder and find the ScrollFeedback asset.
  • Drag and drop the ScrollFeedback asset onto the Scroll Detected Asset property of the Cursor Feedback (Script) component in the Inspector panel.
  • In the Hierarchy panel, click the AstroMan object.
  • In the Inspector panel, click the Add Component button.
  • In the search box, type Gesture Action. Select the search result.
  • In the Hierarchy panel, select the Managers object.
  • Open the GestureManager.cs script in Visual Studio.

We need to edit GestureManager.cs to complete these steps:

  1. Instantiate a NavigationRecognizer as a new GestureRecognizer.
  2. Call SetRecognizableGestures so it recognizes the Tap and NavigationX gestures.
  3. Handle the NavigationStarted, NavigationUpdated, NavigationCompleted and NavigationCanceled events.

using HoloToolkit;
using UnityEngine;
using UnityEngine.VR.WSA.Input;

public class GestureManager : Singleton<GestureManager>
{
    // Tap and Navigation gesture recognizer.
    public GestureRecognizer NavigationRecognizer { get; private set; }

    // Manipulation gesture recognizer.
    public GestureRecognizer ManipulationRecognizer { get; private set; }

    // Currently active gesture recognizer.
    public GestureRecognizer ActiveRecognizer { get; private set; }

    public bool IsNavigating { get; private set; }

    public Vector3 NavigationPosition { get; private set; }

    public bool IsManipulating { get; private set; }

    public Vector3 ManipulationPosition { get; private set; }

    void Awake()
    {
        /* TODO: DEVELOPER CODING EXERCISE 2.b */

        // 2.b: Instantiate the NavigationRecognizer.
        NavigationRecognizer = new GestureRecognizer();

        // 2.b: Add Tap and NavigationX GestureSettings to the NavigationRecognizer's RecognizableGestures.
        NavigationRecognizer.SetRecognizableGestures(
            GestureSettings.Tap |
            GestureSettings.NavigationX);

        // 2.b: Register for the TappedEvent with the NavigationRecognizer_TappedEvent function.
        NavigationRecognizer.TappedEvent += NavigationRecognizer_TappedEvent;
        // 2.b: Register for the NavigationStartedEvent with the NavigationRecognizer_NavigationStartedEvent function.
        NavigationRecognizer.NavigationStartedEvent += NavigationRecognizer_NavigationStartedEvent;
        // 2.b: Register for the NavigationUpdatedEvent with the NavigationRecognizer_NavigationUpdatedEvent function.
        NavigationRecognizer.NavigationUpdatedEvent += NavigationRecognizer_NavigationUpdatedEvent;
        // 2.b: Register for the NavigationCompletedEvent with the NavigationRecognizer_NavigationCompletedEvent function.
        NavigationRecognizer.NavigationCompletedEvent += NavigationRecognizer_NavigationCompletedEvent;
        // 2.b: Register for the NavigationCanceledEvent with the NavigationRecognizer_NavigationCanceledEvent function.
        NavigationRecognizer.NavigationCanceledEvent += NavigationRecognizer_NavigationCanceledEvent;

        // Instantiate the ManipulationRecognizer.
        ManipulationRecognizer = new GestureRecognizer();

        // Add the ManipulationTranslate GestureSetting to the ManipulationRecognizer's RecognizableGestures.
        ManipulationRecognizer.SetRecognizableGestures(
            GestureSettings.ManipulationTranslate);

        // Register for the Manipulation events on the ManipulationRecognizer.
        ManipulationRecognizer.ManipulationStartedEvent += ManipulationRecognizer_ManipulationStartedEvent;
        ManipulationRecognizer.ManipulationUpdatedEvent += ManipulationRecognizer_ManipulationUpdatedEvent;
        ManipulationRecognizer.ManipulationCompletedEvent += ManipulationRecognizer_ManipulationCompletedEvent;
        ManipulationRecognizer.ManipulationCanceledEvent += ManipulationRecognizer_ManipulationCanceledEvent;

        ResetGestureRecognizers();
    }

    void OnDestroy()
    {
        // 2.b: Unregister the Tapped and Navigation events on the NavigationRecognizer.
        NavigationRecognizer.TappedEvent -= NavigationRecognizer_TappedEvent;

        NavigationRecognizer.NavigationStartedEvent -= NavigationRecognizer_NavigationStartedEvent;
        NavigationRecognizer.NavigationUpdatedEvent -= NavigationRecognizer_NavigationUpdatedEvent;
        NavigationRecognizer.NavigationCompletedEvent -= NavigationRecognizer_NavigationCompletedEvent;
        NavigationRecognizer.NavigationCanceledEvent -= NavigationRecognizer_NavigationCanceledEvent;

        // Unregister the Manipulation events on the ManipulationRecognizer.
        ManipulationRecognizer.ManipulationStartedEvent -= ManipulationRecognizer_ManipulationStartedEvent;
        ManipulationRecognizer.ManipulationUpdatedEvent -= ManipulationRecognizer_ManipulationUpdatedEvent;
        ManipulationRecognizer.ManipulationCompletedEvent -= ManipulationRecognizer_ManipulationCompletedEvent;
        ManipulationRecognizer.ManipulationCanceledEvent -= ManipulationRecognizer_ManipulationCanceledEvent;
    }

    /// <summary>
    /// Revert back to the default GestureRecognizer.
    /// </summary>
    public void ResetGestureRecognizers()
    {
        // Default to the navigation gestures.
        Transition(NavigationRecognizer);
    }

    /// <summary>
    /// Transition to a new GestureRecognizer.
    /// </summary>
    /// <param name="newRecognizer">The GestureRecognizer to transition to.</param>
    public void Transition(GestureRecognizer newRecognizer)
    {
        if (newRecognizer == null)
        {
            return;
        }

        if (ActiveRecognizer != null)
        {
            if (ActiveRecognizer == newRecognizer)
            {
                return;
            }

            ActiveRecognizer.CancelGestures();
            ActiveRecognizer.StopCapturingGestures();
        }

        newRecognizer.StartCapturingGestures();
        ActiveRecognizer = newRecognizer;
    }

    private void NavigationRecognizer_NavigationStartedEvent(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        // 2.b: Set IsNavigating to be true.
        IsNavigating = true;

        // 2.b: Set NavigationPosition to be relativePosition.
        NavigationPosition = relativePosition;
    }

    private void NavigationRecognizer_NavigationUpdatedEvent(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        // 2.b: Set IsNavigating to be true.
        IsNavigating = true;

        // 2.b: Set NavigationPosition to be relativePosition.
        NavigationPosition = relativePosition;
    }

    private void NavigationRecognizer_NavigationCompletedEvent(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        // 2.b: Set IsNavigating to be false.
        IsNavigating = false;
    }

    private void NavigationRecognizer_NavigationCanceledEvent(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        // 2.b: Set IsNavigating to be false.
        IsNavigating = false;
    }

    private void ManipulationRecognizer_ManipulationStartedEvent(InteractionSourceKind source, Vector3 position, Ray ray)
    {
        if (HandsManager.Instance.FocusedGameObject != null)
        {
            IsManipulating = true;

            ManipulationPosition = position;

            HandsManager.Instance.FocusedGameObject.SendMessageUpwards("PerformManipulationStart", position);
        }
    }

    private void ManipulationRecognizer_ManipulationUpdatedEvent(InteractionSourceKind source, Vector3 position, Ray ray)
    {
        if (HandsManager.Instance.FocusedGameObject != null)
        {
            IsManipulating = true;

            ManipulationPosition = position;

            HandsManager.Instance.FocusedGameObject.SendMessageUpwards("PerformManipulationUpdate", position);
        }
    }

    private void ManipulationRecognizer_ManipulationCompletedEvent(InteractionSourceKind source, Vector3 position, Ray ray)
    {
        IsManipulating = false;
    }

    private void ManipulationRecognizer_ManipulationCanceledEvent(InteractionSourceKind source, Vector3 position, Ray ray)
    {
        IsManipulating = false;
    }

    private void NavigationRecognizer_TappedEvent(InteractionSourceKind source, int tapCount, Ray ray)
    {
        GameObject focusedObject = InteractibleManager.Instance.FocusedGameObject;

        if (focusedObject != null)
        {
            focusedObject.SendMessageUpwards("OnSelect");
        }
    }
}


Next, open GestureAction.cs in Visual Studio and edit it to implement the following:

  1. Rotate the AstroMan object whenever a Navigation gesture is performed.
  2. Calculate a rotationFactor to control the amount of rotation applied to the object.
  3. Rotate the object around the y-axis when the user moves their hand left or right.

using UnityEngine;

/// <summary>
/// GestureAction performs custom actions based on
/// which gesture is being performed.
/// </summary>
public class GestureAction : MonoBehaviour
{
    [Tooltip("Rotation max speed controls amount of rotation.")]
    public float RotationSensitivity = 10.0f;

    private Vector3 manipulationPreviousPosition;

    private float rotationFactor;

    void Update()
    {
        PerformRotation();
    }

    private void PerformRotation()
    {
        if (GestureManager.Instance.IsNavigating &&
            (!ExpandModel.Instance.IsModelExpanded ||
            (ExpandModel.Instance.IsModelExpanded && HandsManager.Instance.FocusedGameObject == gameObject)))
        {
            /* TODO: DEVELOPER CODING EXERCISE 2.c */

            // 2.c: Calculate rotationFactor based on GestureManager's NavigationPosition.X and multiply by RotationSensitivity.
            // This will help control the amount of rotation.
            rotationFactor = GestureManager.Instance.NavigationPosition.x * RotationSensitivity;

            // 2.c: transform.Rotate along the Y axis using rotationFactor.
            transform.Rotate(new Vector3(0, -1 * rotationFactor, 0));
        }
    }

    void PerformManipulationStart(Vector3 position)
    {
        manipulationPreviousPosition = position;
    }

    void PerformManipulationUpdate(Vector3 position)
    {
        if (GestureManager.Instance.IsManipulating)
        {
            /* TODO: DEVELOPER CODING EXERCISE 4.a */

            Vector3 moveVector = Vector3.zero;

            // 4.a: Calculate the moveVector as position - manipulationPreviousPosition.

            // 4.a: Update the manipulationPreviousPosition with the current position.

            // 4.a: Increment this transform's position by the moveVector.
        }
    }
}


Build and Deploy:

  1. Gaze at the astronaut; two arrows should appear on either side of the cursor. This new cursor indicates that the astronaut can be rotated.
  2. Put your hand in the ready position (index finger pointed towards the sky) so the HoloLens will start tracking your hand.
  3. To rotate the astronaut, lower your index finger into a pinch with your thumb, then move your hand left or right to trigger the NavigationX gesture.

Chapter 3 - Hand guidance

Use the hand guidance score to help predict when hand tracking is about to be lost, and provide feedback on the cursor as the user's hand nears the edge of the camera's field of view.

Instructions:

  • In the Hierarchy panel, select the Managers object.
  • In the Inspector panel, click the Add Component button.
  • In the search box, type Hand Guidance. Select the search result.
  • In the Project panel, open the HoloToolkit\Input\Prefabs folder and find the HandGuidanceFeedback asset.
  • Drag and drop the HandGuidanceFeedback asset onto the Hand Guidance Indicator property in the Inspector panel.
  • In the Hierarchy panel, expand the Cursor object.
  • In the Hierarchy panel, select the Managers object.
  • Drag and drop the CursorBillboard object (under Cursor) onto the Indicator Parent property in the Inspector panel.
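The hand guidance score the chapter relies on is exposed by the gesture API as `sourceLossRisk` on each tracked source, together with a `sourceLossMitigationDirection` pointing back toward the trackable region. The sketch below shows one way an indicator could use those values; the class name, threshold, and rotation logic are illustrative assumptions, not the exact HoloToolkit HandGuidance.cs implementation.

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

/// <summary>
/// Illustrative sketch: show a guidance indicator when the tracked
/// hand risks leaving the gesture frame.
/// </summary>
public class HandGuidanceSketch : MonoBehaviour
{
    [Tooltip("Indicator object to show when the hand risks leaving the view.")]
    public GameObject HandGuidanceIndicator;

    [Tooltip("Show the indicator when sourceLossRisk exceeds this value (0..1).")]
    public float HandGuidanceThreshold = 0.5f;

    void Awake()
    {
        InteractionManager.SourceUpdated += OnSourceUpdated;
    }

    private void OnSourceUpdated(InteractionSourceState hand)
    {
        // sourceLossRisk approaches 1 as the hand nears the edge of the
        // gesture frame.
        bool atRisk = hand.properties.sourceLossRisk > HandGuidanceThreshold;
        HandGuidanceIndicator.SetActive(atRisk);

        if (atRisk)
        {
            // Aim the indicator the way the user should move their hand
            // to keep it tracked.
            Vector3 dir = hand.properties.sourceLossMitigationDirection;
            if (dir != Vector3.zero)
            {
                HandGuidanceIndicator.transform.localRotation =
                    Quaternion.FromToRotation(Vector3.up, dir);
            }
        }
    }

    void OnDestroy()
    {
        InteractionManager.SourceUpdated -= OnSourceUpdated;
    }
}
```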

Build and Deploy:

When your hand comes into the camera's view, a small hand icon appears to indicate that your hand is being tracked.

Chapter 4 - Manipulation

Use Manipulation events to move holograms, and give the user feedback via the cursor to show when Manipulation is active.

Instructions:

The GestureManager.cs and AstronautManager.cs scripts will allow you to:

  1. Use the speech keyword "Move Astronaut" to enable the Manipulation gesture.
  2. Switch to using the Manipulation Gesture Recognizer.
  3. Manage the GestureRecognizer transitions when switching between Navigation and Manipulation.

Let's get started:

  • In the Hierarchy panel, select the Managers object.
  • In the Inspector panel, click the Add Component button.
  • In the search box, type Astronaut Manager. Select the search result.
  • In the Hierarchy panel, click Cursor.
  • In the Project panel, open the HoloToolkit\Input\Prefabs folder and find the PathingFeedback asset.
  • Drag and drop the PathingFeedback asset onto the Pathing Detected Asset property of the Cursor States (Script) component in the Inspector panel.

Next, edit the GestureAction.cs script again:

using UnityEngine;

/// <summary>
/// GestureAction performs custom actions based on
/// which gesture is being performed.
/// </summary>
public class GestureAction : MonoBehaviour
{
    [Tooltip("Rotation max speed controls amount of rotation.")]
    public float RotationSensitivity = 10.0f;

    private Vector3 manipulationPreviousPosition;

    private float rotationFactor;

    void Update()
    {
        PerformRotation();
    }

    private void PerformRotation()
    {
        if (GestureManager.Instance.IsNavigating &&
            (!ExpandModel.Instance.IsModelExpanded ||
            (ExpandModel.Instance.IsModelExpanded && HandsManager.Instance.FocusedGameObject == gameObject)))
        {
            /* TODO: DEVELOPER CODING EXERCISE 2.c */

            // 2.c: Calculate rotationFactor based on GestureManager's NavigationPosition.X and multiply by RotationSensitivity.
            // This will help control the amount of rotation.
            rotationFactor = GestureManager.Instance.NavigationPosition.x * RotationSensitivity;

            // 2.c: transform.Rotate along the Y axis using rotationFactor.
            transform.Rotate(new Vector3(0, -1 * rotationFactor, 0));
        }
    }

    void PerformManipulationStart(Vector3 position)
    {
        manipulationPreviousPosition = position;
    }

    void PerformManipulationUpdate(Vector3 position)
    {
        if (GestureManager.Instance.IsManipulating)
        {
            /* TODO: DEVELOPER CODING EXERCISE 4.a */

            Vector3 moveVector = Vector3.zero;

            // 4.a: Calculate the moveVector as position - manipulationPreviousPosition.
            moveVector = position - manipulationPreviousPosition;

            // 4.a: Update the manipulationPreviousPosition with the current position.
            manipulationPreviousPosition = position;

            // 4.a: Increment this transform's position by the moveVector.
            transform.position += moveVector;
        }
    }
}


Build and Deploy:

  1. After deployment, move your hand in front of the device and raise your index finger so it can be tracked.
  2. Point the gaze cursor at the astronaut.
  3. Say "Move Astronaut" to move the astronaut with the Manipulation gesture.
  4. Four arrows should appear around the cursor to indicate that the program will now respond to Manipulation events.
  5. Lower your index finger onto your thumb, and keep them pinched together.
  6. As you move your hand, the astronaut will move too (this is Manipulation).
  7. Raise your index finger to stop manipulating the astronaut.
  8. Note: If you do not say "Move Astronaut" before moving your hand, the Navigation gesture will be used instead.

Chapter 5 - Model expansion

Use an explosion animation to split the astronaut into many smaller pieces, each of which can be moved and rotated individually.

To expand the model, use the speech command "Expand Model".

To restore the original model, use the speech command "Reset Model".

Edit the AstronautManager.cs script:

using HoloToolkit;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

public class AstronautManager : Singleton<AstronautManager>
{
    float expandAnimationCompletionTime;
    // Store a bool for whether our astronaut model is expanded or not.
    bool isModelExpanding = false;

    // KeywordRecognizer object.
    KeywordRecognizer keywordRecognizer;

    // Defines which function to call when a keyword is recognized.
    delegate void KeywordAction(PhraseRecognizedEventArgs args);
    Dictionary<string, KeywordAction> keywordCollection;

    void Start()
    {
        keywordCollection = new Dictionary<string, KeywordAction>();

        // Add keyword to start manipulation.
        keywordCollection.Add("Move Astronaut", MoveAstronautCommand);

        // Add keyword Expand Model to call the ExpandModelCommand function.
        keywordCollection.Add("Expand Model", ExpandModelCommand);

        // Add keyword Reset Model to call the ResetModelCommand function.
        keywordCollection.Add("Reset Model", ResetModelCommand);

        // Initialize KeywordRecognizer with the previously added keywords.
        keywordRecognizer = new KeywordRecognizer(keywordCollection.Keys.ToArray());
        keywordRecognizer.OnPhraseRecognized += KeywordRecognizer_OnPhraseRecognized;
        keywordRecognizer.Start();
    }

    private void KeywordRecognizer_OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        KeywordAction keywordAction;

        if (keywordCollection.TryGetValue(args.text, out keywordAction))
        {
            keywordAction.Invoke(args);
        }
    }

    private void MoveAstronautCommand(PhraseRecognizedEventArgs args)
    {
        GestureManager.Instance.Transition(GestureManager.Instance.ManipulationRecognizer);
    }

    private void ResetModelCommand(PhraseRecognizedEventArgs args)
    {
        // Reset local variables.
        isModelExpanding = false;

        // Disable the expanded model.
        ExpandModel.Instance.ExpandedModel.SetActive(false);

        // Enable the idle model.
        ExpandModel.Instance.gameObject.SetActive(true);

        // Enable the animators for the next time the model is expanded.
        Animator[] expandedAnimators = ExpandModel.Instance.ExpandedModel.GetComponentsInChildren<Animator>();
        foreach (Animator animator in expandedAnimators)
        {
            animator.enabled = true;
        }

        ExpandModel.Instance.Reset();
    }

    private void ExpandModelCommand(PhraseRecognizedEventArgs args)
    {
        // Swap out the current model for the expanded model.
        GameObject currentModel = ExpandModel.Instance.gameObject;

        ExpandModel.Instance.ExpandedModel.transform.position = currentModel.transform.position;
        ExpandModel.Instance.ExpandedModel.transform.rotation = currentModel.transform.rotation;
        ExpandModel.Instance.ExpandedModel.transform.localScale = currentModel.transform.localScale;

        currentModel.SetActive(false);
        ExpandModel.Instance.ExpandedModel.SetActive(true);

        // Play animation.  Ensure the Loop Time check box is disabled in the inspector for this animation to play it once.
        Animator[] expandedAnimators = ExpandModel.Instance.ExpandedModel.GetComponentsInChildren<Animator>();
        // Set local variables for disabling the animation.
        if (expandedAnimators.Length > 0)
        {
            expandAnimationCompletionTime = Time.realtimeSinceStartup + expandedAnimators[0].runtimeAnimatorController.animationClips[0].length * 0.9f;
        }

        // Set the expand model flag.
        isModelExpanding = true;

        ExpandModel.Instance.Expand();
    }

    public void Update()
    {
        if (isModelExpanding && Time.realtimeSinceStartup >= expandAnimationCompletionTime)
        {
            isModelExpanding = false;

            Animator[] expandedAnimators = ExpandModel.Instance.ExpandedModel.GetComponentsInChildren<Animator>();

            foreach (Animator animator in expandedAnimators)
            {
                animator.enabled = false;
            }
        }
    }
}


Build and deploy to the device:

  • Say "Expand Model" to see the exploded model.
  • Use Navigation to rotate individual pieces of the astronaut.
  • Say "Move Astronaut", and while the pathing cursor is shown, use Manipulation to move individual pieces.
  • Say "Reset Model" to return the astronaut to its original form.

Original article: https://developer.microsoft.com/EN-US/WINDOWS/HOLOGRAPHIC/holograms_211

Please point out any translation mistakes. Thank you!

Posted: 2024-10-12 15:39:36
