Neo Kinect | Code Plugins

With the Neo Kinect plugin you can use the Kinect v2 sensor's advanced capabilities within Unreal Engine, either through easy-to-use, fully commented Blueprint nodes or directly through the C++ methods. And starting with Unreal Engine 5, the plugin is fully compatible with DX12 and all its perks, like ray tracing and Lumen.


Examples and Quick Start doc

Quick Start guide: NeoKinect-QuickStart.pdf


Example Project for Unreal Engine 5: NeoKinectExamples.zip.

  • This example does not yet use the new UE5 demo room, only the new skeleton (the UE4 one is still included as well).


Example Project for Unreal Engine 4: NeoKinectExamples_UE4.zip.


Videos of projects made with this plugin:

https://www.youtube.com/watch?v=RyDmDAZYbAE

https://www.youtube.com/watch?v=x0RH_-xRhFU

https://www.youtube.com/watch?v=59fnWYRQN88


FAQ

If you have a question about a problem with the plugin, it may already be answered here, in the FAQ section at the end of the first post.


Release Notes

Minor engine version updates are not listed here; only changes related to functionality and fixes are.


Unreal 5.0 release (April 4, 2022)

Front-end (project and Blueprints):

  • Frames are now returned as TextureRenderTarget2D from SetUseFrame. Widgets accessing the frame textures need to be updated, since they previously required a cast to a custom texture format. That custom format no longer exists, as its customizations are no longer needed (see the sketch after this list for the updated approach). As a bonus, this change enables the next item in this list.
  • Now compatible with DX12 and all its perks, like Ray Tracing, Lumen and so on!
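For reference, here is a rough C++ sketch of what consuming the returned render target can look like after this change. The material parameter name and the binding function are just an example; check SetUseFrame's exact signature in the plugin's headers.

    #include "Components/Image.h"
    #include "Engine/TextureRenderTarget2D.h"
    #include "Materials/MaterialInstanceDynamic.h"

    // ColorFrame is the UTextureRenderTarget2D returned by SetUseFrame (UE5+).
    // No cast to a custom texture class is needed anymore: plug the render
    // target into a dynamic material and show that material in a UMG Image.
    void BindFrameToWidget(UImage* ImageWidget, UMaterialInterface* FrameMaterial,
                           UTextureRenderTarget2D* ColorFrame)
    {
        UMaterialInstanceDynamic* FrameMID =
            UMaterialInstanceDynamic::Create(FrameMaterial, ImageWidget);
        FrameMID->SetTextureParameterValue(TEXT("FrameTexture"), ColorFrame);
        ImageWidget->SetBrushFromMaterial(FrameMID);
    }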

Code:

  • Fixed a memory leak when enabling frame types.
  • Lots of IWYU improvements, with less included clutter in public headers.


Features Overview


Robust and fast

The plugin was created with performance and usability in mind, so you can track all 6 possible users, their faces, and enable all of the Kinect's frame types (color, depth, infrared etc.) at the same time with almost no performance hit. Sensor polling runs in its own thread. The frame textures (color camera, depth camera, IR and their variations) are TextureRenderTarget2D (UE5+ only; a custom texture format was used on UE4), which is compatible with materials, UMG Widgets and so on. There are also functions to access the textures' pixel values from Blueprint.
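Because the frames are plain render targets on UE5+, the stock engine helpers also work for quick pixel reads, as in the sketch below (this is the generic engine route; the plugin's own Blueprint pixel-access functions are separate from it).

    #include "Engine/TextureRenderTarget2D.h"
    #include "Kismet/KismetRenderingLibrary.h"

    // Reads a single pixel back from a frame render target with the standard
    // engine helper. Fine for occasional queries; avoid per-pixel loops every
    // tick, since the readback stalls the GPU.
    FColor SampleFramePixel(UObject* WorldContext, UTextureRenderTarget2D* Frame,
                            int32 X, int32 Y)
    {
        return UKismetRenderingLibrary::ReadRenderTargetPixel(WorldContext, Frame, X, Y);
    }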


No need for components

The sensor is unique, no matter how many Actors or Widgets are using it. So, instead of adding components, you just call functions as you would with a function library. That way you can control the device from any Blueprint, including Widgets.
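In practice, any class can do something like the sketch below. The library and function names here are illustrative placeholders, not the plugin's exact identifiers.

    // Illustrative placeholders: UNeoKinectFunctionLibrary, InitSensor and
    // GetSensorHeight stand in for the plugin's actual function-library calls.
    void StartKinectFromAnywhere()
    {
        // No component to add or attach: an Actor, a GameMode or a UMG Widget
        // can all drive the single global sensor through the same static calls.
        UNeoKinectFunctionLibrary::InitSensor();
        const float SensorHeight = UNeoKinectFunctionLibrary::GetSensorHeight();
        UE_LOG(LogTemp, Log, TEXT("Kinect sensor height: %f"), SensorHeight);
    }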


Advanced Remapping

Besides access to the standard Microsoft Kinect API coordinate remapping methods, the plugin also comes with other remapping features that facilitate AR applications, like getting the location of a joint in the Color frame without losing its depth information. Every location and orientation is adapted to Unreal's coordinate system, and joint transforms are compatible with the Engine's Mannequin character rig.
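As a concrete example, an AR overlay that follows a user's hand could do something along these lines (the function and enum names below are placeholders for illustration, not the plugin's exact ones):

    // Illustrative placeholders: GetJointLocation and MapCameraPointToColorSpace
    // stand in for the plugin's real joint accessor and remapping call.
    FVector GetHandInColorSpace(int32 BodyIndex)
    {
        // 3D joint position, already converted to Unreal's coordinate system.
        const FVector HandLocation =
            UNeoKinectFunctionLibrary::GetJointLocation(BodyIndex, EKinectJoint::HandRight);

        // Remap to Color-frame coordinates while keeping depth, so a widget or
        // mesh overlaid on the color image can still be scaled and sorted by
        // distance from the sensor.
        return UNeoKinectFunctionLibrary::MapCameraPointToColorSpace(HandLocation, /*bKeepDepth=*/true);
    }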


Fully production proven

I used Neo Kinect extensively (for more than a year) before releasing it to the public, fixing all bugs found so far and making many performance improvements along the way. It has been used in applications that run for a whole day without crashing, and it packages without problems.

Technical Details

Body tracking:

  • Tracking of up to 6 simultaneous users' skeletons, with 25 joints each (see the sketch after this list)
  • User leaning angle, tracking confidence, body edge clipping, hand states
  • Per Body found/lost events
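A rough C++ sketch of the "6 bodies, 25 joints each" loop (the getters are placeholder names; the plugin exposes the equivalent data as Blueprint nodes):

    #include "DrawDebugHelpers.h"

    // Illustrative placeholders: IsBodyTracked and GetJointTransform stand in
    // for the plugin's real per-body and per-joint accessors.
    void DebugDrawTrackedBodies(UWorld* World)
    {
        for (int32 BodyIndex = 0; BodyIndex < 6; ++BodyIndex)          // up to 6 tracked users
        {
            if (!UNeoKinectFunctionLibrary::IsBodyTracked(BodyIndex))
            {
                continue;
            }
            for (int32 JointIndex = 0; JointIndex < 25; ++JointIndex)  // 25 joints per skeleton
            {
                const FTransform Joint =
                    UNeoKinectFunctionLibrary::GetJointTransform(BodyIndex, JointIndex);
                DrawDebugPoint(World, Joint.GetLocation(), 6.f, FColor::Green);
            }
        }
    }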


Face tracking:

  • Location and orientation of up to 6 simultaneous users' faces
  • Face points (left and right eyes, nose and left and right mouth corners) in 3D and 2D (Color and Infrared space)
  • Faces bounding boxes in Color and Infrared frames space
  • Expression readings (Engaged, Happy, Looking Away, Mouth Moved, Mouth Open and Left and Right Eyes Open/Closed) and whether users are wearing glasses or not (see the sketch after this list)
  • Per Face found/lost events
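For example, checking whether a tracked face looks happy could boil down to a call like this (placeholder names; the plugin exposes these readings as Blueprint nodes):

    // Illustrative placeholders: IsExpressionDetected and EKinectFaceExpression
    // stand in for the plugin's real face-expression accessors.
    bool IsUserSmiling(int32 FaceIndex)
    {
        // Expression readings are per tracked face (up to 6 at once).
        return UNeoKinectFunctionLibrary::IsExpressionDetected(
            FaceIndex, EKinectFaceExpression::Happy);
    }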


Sensor control:

  • Global bodies/faces tracking events (found/lost)
  • Init/Uninit sensor (see the sketch after this list)
  • Get sensor tilt, ground plane normal and sensor height
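A typical lifecycle sketch, assuming a hypothetical UMyGameInstance subclass and placeholder function names for the Init/Uninit calls:

    // Illustrative placeholders for the Init/Uninit sensor calls listed above.
    void UMyGameInstance::Init()
    {
        Super::Init();
        UNeoKinectFunctionLibrary::InitSensor();     // start using the sensor
    }

    void UMyGameInstance::Shutdown()
    {
        UNeoKinectFunctionLibrary::UninitSensor();   // release the device
        Super::Shutdown();
    }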


Remapping:

  • 3D camera location to Color texture (optionally with depth) and to Depth texture
  • Find depth of a Color texture location
  • Depth point to Color point and to 3D location (sketched below)
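For instance, turning a Depth-frame pixel into a world-space point could look roughly like this (placeholder name for the remapping call):

    // Illustrative placeholder: MapDepthPointToCameraSpace stands in for the
    // plugin's real depth-to-3D remapping call.
    FVector DepthPixelToWorld(int32 DepthX, int32 DepthY)
    {
        // Takes a pixel coordinate in the Depth frame and returns the
        // corresponding 3D location in Unreal's coordinate system.
        return UNeoKinectFunctionLibrary::MapDepthPointToCameraSpace(DepthX, DepthY);
    }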


Frames (textures):

  • Get each frame's FOV and dimensions
  • Toggle frame usage individually (see the sketch after this list)
  • Sample a pixel value from the Depth frame and find the depth of a Color pixel
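Putting those together, a sketch of enabling the Depth frame and probing it (SetUseFrame follows the name mentioned in the release notes; EKinectFrame, GetFrameDimensions and GetDepthAtColorPoint are assumed names):

    // Assumed names except SetUseFrame, which is mentioned in the release notes.
    void EnableAndProbeDepth()
    {
        // Enable only the frames you need; each type can be toggled individually.
        UTextureRenderTarget2D* DepthFrame =
            UNeoKinectFunctionLibrary::SetUseFrame(EKinectFrame::Depth, true);

        // Frame dimensions (and FOV) can be queried per frame type.
        const FIntPoint DepthSize =
            UNeoKinectFunctionLibrary::GetFrameDimensions(EKinectFrame::Depth);

        // Find how far away the object under a given Color-frame pixel is.
        const float DepthUnderPixel =
            UNeoKinectFunctionLibrary::GetDepthAtColorPoint(FVector2D(960.f, 540.f));

        UE_LOG(LogTemp, Log, TEXT("Depth frame %dx%d, depth under pixel: %f"),
               DepthSize.X, DepthSize.Y, DepthUnderPixel);
    }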


Network Replicated: No

Platform: Win64 only