Week 2 (02/09/2024)
We decided to explore other alternatives and study which tool might be the better choice for us going forward.
I gathered a comparative study from online reviews on Unreal vs. Unity (source):
| Software | Unreal Engine | Unity |
| --- | --- | --- |
| Features | **Unreal Engine AR Framework** is a set of features and plugins that enable you to create AR applications using Unreal Engine. It supports various AR platforms and devices, such as ARKit, ARCore, Magic Leap, and HoloLens. You can use it to access the native features of each platform, such as face tracking, plane detection, image recognition, and spatial mapping, and to leverage the advanced rendering, lighting, and animation capabilities of Unreal Engine to create realistic and immersive AR experiences. | **Unity MARS** (Mixed and Augmented Reality Studio) is a dedicated extension for Unity that simplifies and streamlines the AR development process. Unity MARS allows you to create AR scenes that adapt to the real world, using data from sensors, cameras, and location services. You can also test and preview your AR scenes in different environments and conditions without leaving the editor. Unity MARS integrates with other Unity features, such as Timeline, Cinemachine, and Shader Graph, to enhance the visual and interactive quality of your AR scenes. |
| Review | Although Unreal is my primary option whenever I need an engine, I don't recommend it for AR. Its ARKit and ARCore code is outdated and recently even had some bugs in plane detection that would prevent you from tracking vertical planes correctly. If your application is simple and only needs horizontal plane detection, then you're safe using Unreal. If you need the latest updates from the AR hardware libraries, better detection, or even some shader effects (Unreal, for instance, can't do the portal effect for mobile), then I would advise using something else, such as Unity. | Unity Sentis will be a game changer for any organization looking to get Large Language Models into their Unity deployments. Implementing this new toolset is free for now and provides a ton of value, from NPCs to enhanced features. |
| Features | **Unreal Engine Blueprint** is a visual scripting system that allows you to create logic and behavior for your AR applications without writing any code. Blueprint uses nodes and wires to represent functions, variables, events, and data flow. You can use it to create complex AR interactions, such as gesture recognition, object manipulation, UI elements, and audio effects, and to debug and optimize your AR applications using tools such as breakpoints, watches, and performance analyzers. | **Unity AR Foundation** is a cross-platform API that abstracts the differences between various AR platforms and devices and provides a common interface for accessing AR features. AR Foundation supports ARKit, ARCore, Magic Leap, HoloLens, and more. You can use it to create AR applications that work across multiple platforms and devices without writing platform-specific code, and integrate it with other Unity packages, such as XR Interaction Toolkit, ARKit Face Tracking, and ARCore Depth API, to add more functionality and interactivity (see the first code sketch after the table). |
| Review | The Blueprint system from Unreal is very easy to use. For those not familiar with programming, it's a great start. It has several pre-built functions that make it easier to assemble whatever you want; at the same time, it also has more advanced nodes that give you control over more in-depth tools to achieve complex results. Nowadays there is a lot of material on the internet, from Epic and other creators/developers, to start working with Blueprints. The templates provided by Epic inside Unreal for AR and VR are also a great start for getting familiar with how things work and checking some examples. | I recommend using Unity for AR content, as I said previously in the Unreal AR section. Unity AR Foundation is more up to date with what's latest for AR content, so you'll have better options for implementing detections and interactions using it. It works really well for both Android and iOS, and most of the time you can export the same code for both platforms. Unity's AR Foundation is incredibly useful for building cross-platform AR experiences: it abstracts features like image tracking and plane tracking, making it possible to write the code once and publish your experience to both Android and iOS. ARKit currently has some AR features that can be used when targeting iOS; AR Foundation makes it possible to detect and use these features if they are available. |
| Features | **Unreal Engine Niagara** is a particle system that allows you to create stunning and realistic effects for your AR applications. Niagara uses a modular and data-driven approach to create and control particles, such as sparks, smoke, fire, water, and more. You can use it to create AR effects that react to the real world, such as lighting, physics, and collisions, and to customize and optimize your effects using tools such as graphs, modules, and presets. | **Unity ML-Agents** is a framework that enables you to use machine learning and artificial intelligence in your AR applications. ML-Agents allows you to train and deploy intelligent agents that can learn from their own actions and the feedback from the environment. You can use it to create AR applications with adaptive and dynamic behaviors, such as character animation, object detection, navigation, and dialogue, and integrate it with other machine learning frameworks, such as TensorFlow, PyTorch, and ONNX, to leverage the latest research and models (see the second code sketch after the table). |
| Review | Niagara truly is a great feature. You can build any type of VFX there, from simple particles to Hollywood or AAA-game level. It does require a bit of patience to start and understand all its options, but it also has some templates to help you start and tons of online material to help you design your effects. For AR content, I would advise you to give some time to studying the tool so you learn how to optimize your effects. Remember that most AR platforms run on mobile devices, and their hardware and shaders won't have the same processing power and features available on desktop. Also, don't forget to turn on the mobile shader preview in the engine menu so you can really see how your effect will look on a mobile platform. | |
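To make the cross-platform claim about AR Foundation more concrete, here is a minimal tap-to-place sketch in C#. It assumes a standard AR Foundation scene (an AR Session plus an AR Session Origin carrying `ARPlaneManager` and `ARRaycastManager`); the `TapToPlace` class name and the `objectPrefab` field are illustrative choices of ours, not something taken from the reviews above.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: place a prefab on a detected plane when the user taps the screen.
// Attach to the same GameObject as the ARRaycastManager (typically the AR Session Origin).
[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    // Prefab to spawn on a detected plane; assumed to be assigned in the Inspector.
    [SerializeField] private GameObject objectPrefab;

    private ARRaycastManager raycastManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    private void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against planes detected by ARKit/ARCore; AR Foundation hides the
        // platform differences behind this single call.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(objectPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

The same script should build for both Android (ARCore) and iOS (ARKit) without platform-specific branches, which is the point the reviewers make about writing the code once.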
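Similarly, a rough idea of what the ML-Agents workflow looks like on the C# side: an agent subclass reports observations, receives actions, and hands out rewards, while training itself runs in the separate Python trainer. The `ReachTargetAgent` below is a generic sketch of the API, not something tied to our AR use case.

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Illustrative sketch: an agent that learns to move toward a target in the XZ plane.
public class ReachTargetAgent : Agent
{
    // Target transform; assumed to be assigned in the Inspector.
    public Transform target;

    public override void OnEpisodeBegin()
    {
        // Reset the agent at the start of every training episode.
        transform.localPosition = Vector3.zero;
    }

    public override void CollectObservations(VectorSensor sensor)
    {
        // The agent observes its own position and the target's position.
        sensor.AddObservation(transform.localPosition);
        sensor.AddObservation(target.localPosition);
    }

    public override void OnActionReceived(ActionBuffers actions)
    {
        // Two continuous actions drive movement on X and Z.
        var move = new Vector3(actions.ContinuousActions[0], 0f, actions.ContinuousActions[1]);
        transform.localPosition += move * Time.deltaTime;

        // Reward the agent and end the episode once it reaches the target.
        if (Vector3.Distance(transform.localPosition, target.localPosition) < 0.5f)
        {
            SetReward(1f);
            EndEpisode();
        }
    }
}
```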
After discussing everyone else’s study as well, we decided that Unity might be better suited for our project!
![](https://lh7-us.googleusercontent.com/NMMNi1bc47OHHGPJPldIQqHaKer46HV3pB3vSE7CbYtR7mpJvdVW2pFrdGX0umt1xauNo894v8sWtDXYPFH9bVwlIjzLAfgy6CjuMXRO8xmI-aevfPuPD4uAWCabXQtm7WQ-Oe-_qiA_FVtWmA8qGeE)