The Fectar Spatial Engine employs a JavaScript-based scripting language for developing interactive XR experiences. This language is designed to be familiar to developers who have experience with modern web technologies and game development platforms like Unity. Below are some key aspects of the language and coding environment:
1. JavaScript Syntax
The scripting language in the Fectar Spatial Engine is a sandboxed JavaScript environment, featuring familiar syntax and constructs such as variables, functions, and object-oriented programming (OOP). Developers who already know JavaScript will find coding within the Fectar Spatial Engine straightforward.
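For instance, the snippet below is plain, standard JavaScript of the kind such a sandboxed environment runs; it uses only core language features and assumes no engine-specific API:

```javascript
// A plain JavaScript class using standard language features only.
class Counter {
  constructor(start = 0) {
    this.value = start;
  }

  increment(step = 1) {
    this.value += step;
    return this.value;
  }
}

const counter = new Counter();
counter.increment();   // 1
counter.increment(4);  // 5
```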
2. Learning and Development Tools
The engine comes with a library of code snippets that developers can use as starting points for their projects. These snippets not only provide ready-made functionality but also serve as learning tools, allowing developers to see how specific features are implemented and modify them according to their needs.
3. Modules and Classes
The engine is organized into various modules, each containing classes and methods that are specific to different functionalities, such as animation, interaction, events, and particle systems. For example:
- Core Module: Includes fundamental classes like Vector3, Vector4, and Quaternion for handling 3D vectors, 4D vectors, and rotations (see the sketch after this list).
- Interaction Module: Contains classes that manage user interactions within the XR environment.
- Event Module: Handles event-driven programming, such as responding to user inputs or collisions.
- Animation Module: Manages object animations, allowing for dynamic and interactive scenes.
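As an illustration of the kind of 3D math the Core Module covers, the sketch below defines a minimal stand-in Vector3 class; it is not the engine's actual implementation, and the real Vector3, Vector4, and Quaternion classes will have their own constructors and methods:

```javascript
// Minimal stand-in for a Vector3-style class, purely to illustrate the kind of
// 3D math the Core Module handles; the engine's own Vector3 API may differ.
class Vector3 {
  constructor(x = 0, y = 0, z = 0) {
    this.x = x;
    this.y = y;
    this.z = z;
  }

  add(other) {
    return new Vector3(this.x + other.x, this.y + other.y, this.z + other.z);
  }

  length() {
    return Math.sqrt(this.x * this.x + this.y * this.y + this.z * this.z);
  }
}

const position = new Vector3(1, 2, 2);
const moved = position.add(new Vector3(0, 1, 0)); // Vector3 { x: 1, y: 3, z: 2 }
const distance = position.length();               // 3
```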
4. Functional Programming Support
In addition to OOP, the engine supports functional programming paradigms, allowing developers to write more declarative and concise code.
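For example, standard functional constructs such as filter, map, and arrow functions let data transformations be written declaratively; nothing in this snippet is engine-specific:

```javascript
// Declarative, functional style: transform a list of scores without explicit loops.
const scores = [12, 47, 89, 33, 95];

const highScores = scores
  .filter(score => score >= 40)  // keep only qualifying scores
  .map(score => score * 2)       // apply a transformation
  .sort((a, b) => b - a);        // order from highest to lowest

console.log(highScores);         // [190, 178, 94]
```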
5. Extensibility and Customization
The engine allows for deep customization through its API, providing the flexibility needed for advanced development.
6. Event-Driven Architecture
The Fectar Spatial Engine uses an event-driven architecture, in which events such as user interactions, collisions, and animations can trigger specific functions. This allows XR experiences to respond in real time to user input.
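The engine's actual event API is not reproduced here; the sketch below uses a small hypothetical event bus and a made-up "tap" event simply to show the general shape of event-driven code, where handlers are registered once and run whenever the event fires:

```javascript
// Tiny event-emitter stand-in to illustrate the event-driven pattern;
// the engine's real event names, registration calls, and payloads may differ.
class EventBus {
  constructor() {
    this.handlers = {};
  }

  on(eventName, handler) {
    (this.handlers[eventName] = this.handlers[eventName] || []).push(handler);
  }

  emit(eventName, payload) {
    (this.handlers[eventName] || []).forEach(handler => handler(payload));
  }
}

const events = new EventBus();

// Register a handler for a hypothetical "tap" interaction on an object.
events.on('tap', payload => {
  console.log(`Object ${payload.objectId} was tapped`);
});

// In a real scene the engine would emit this; here we trigger it manually.
events.emit('tap', { objectId: 'cube-01' });
```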
7. Integrated Particle System
The engine includes a built-in particle system to create complex visual effects such as fire, smoke, and snow.
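To give a sense of what defining such an effect typically involves, the sketch below uses a hypothetical configuration object; the property names and emitter function are assumptions for illustration, not the engine's documented interface:

```javascript
// Hypothetical particle-effect configuration; the real particle system exposes
// its own property names and emitter API, so treat this as illustrative only.
const snowEffect = {
  texture: 'snowflake.png',
  emissionRate: 50,                     // particles spawned per second
  lifetime: 4,                          // seconds each particle lives
  startVelocity: { x: 0, y: -1, z: 0 },
  startSize: 0.05,
  endSize: 0.02
};

// A trivial stand-in "emitter" that only reports the steady-state particle count.
function describeEmitter(config) {
  const alive = config.emissionRate * config.lifetime;
  console.log(`Roughly ${alive} particles alive at steady state`);
}

describeEmitter(snowEffect); // Roughly 200 particles alive at steady state
```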