Muhammad Imran Majid1,2*, Ali Gauhar2, and Aamir Rasool2
1Visiting Associate Professor, School of Engineering, University of Warwick, UK
2Department of Electrical Engineering, Institute of Business Management, Karachi, Pakistan
* Corresponding Author: [email protected]
INDEX TERMS complementary filter, Degrees of Freedom (DoF), host computer, Inverse Kinematics (IK), Inertial Measurement Unit (IMU), Internet of Things (IoT), Unity 3D
Motion capture (mo-cap or mocap) is a technique for recording the motion of an object or human. It has innumerable applications in different fields, such as entertainment, military, medical, sports, robot validation and computer vision. Primarily, it is used in video games and filmmaking to capture and record human actions and to animate digital character models in 3D or 2D on a host computer. There are various approaches to the human-computer interface, but motion capture has been a well-researched topic with applications in virtual reality, decision-making, health monitoring, demand reliability, and quick response to input changes. In other words, it can be defined as a technique to digitally capture the movement of objects or humans with the assistance of sensors mounted on a target, together with simulation tools. Augmented Reality (AR) is an instance of pervasive computing technology in which designers enhance parts of the user's real-world interaction using computer-generated input. AR interfaces, unlike fully immersive Virtual Reality, allow users to see the physical world while simultaneously displaying virtual imagery attached to real locations and objects with the help of Inverse Kinematics (IK) [1]. IK is the mathematical process of calculating the variable joint parameters needed to reach a desired endpoint. The technique was developed for gait analysis in the life sciences industry; however, it is now widely used by VFX studios, sports therapists, and neuroscientists for computer vision, robotics validation, and control [2].
The Inverse Kinematics (IK) technique sets all Degrees of Freedom (DoF) in an automated fashion once the end effectors, i.e., the positions and orientations of selected joints, are specified. The user only has to define the end effectors; the remaining DoF are then determined automatically according to criteria that depend on the variant of Inverse Kinematics being used.
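For illustration only, a minimal sketch of analytic IK for a two-link planar arm is given below; the link lengths and target coordinates are arbitrary example values and the function is an assumption for exposition, not part of the proposed system.

```cpp
// Minimal sketch: analytic inverse kinematics for a 2-link planar arm.
// Given link lengths L1, L2 and a desired end-effector position (x, y),
// solve for the two joint angles using the law of cosines.
#include <cmath>
#include <cstdio>

bool solveTwoLinkIK(double L1, double L2, double x, double y,
                    double &shoulder, double &elbow) {
    double d2 = x * x + y * y;                            // squared distance to target
    double c = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2);  // cos(elbow)
    if (c < -1.0 || c > 1.0) return false;                // target out of reach
    elbow = std::acos(c);                                  // elbow-down solution
    shoulder = std::atan2(y, x) -
               std::atan2(L2 * std::sin(elbow), L1 + L2 * std::cos(elbow));
    return true;
}

int main() {
    double q1, q2;
    if (solveTwoLinkIK(0.30, 0.25, 0.35, 0.20, q1, q2))
        std::printf("shoulder = %.3f rad, elbow = %.3f rad\n", q1, q2);
}
```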
Current systems lack the ability to reduce noise sufficiently to reproduce motion faithfully. Further, there is a lack of calibration data for the offset bias assessed with the help of COTS (commercial off-the-shelf) components.
The aim of the current study is to provide a user-friendly mocap system with the following qualities:
The suggested prototype is designed to measure the six degrees of freedom and capture the movement of any object through sensor detection. The prototype has two sensors and one microcontroller connected to the host computer; the mocap sensors collect the values of all parameters (6 DoF) and send them to the controller, which then processes them according to the uploaded program. The parameters it extracts are as follows:
The prototype can be easily maintained and deployed anywhere, though a host computer and a good internet connection are essential for the working environment.
Motion capture technology helps make video games vivid and lifelike. The project aim, scope and applicability are discussed. The current study is based on IoT; therefore, the project is capable of extracting the 6 DoF and placing them on the cloud, along with capturing an object's motion and translating it onto a 3D skeleton [4]. Essentially, the project focuses on a prototype; however, future researchers can employ more advanced sensors as a development path. Furthermore, additional financing could improve the structure and functionality of the project.
Motion capture is a movement detection technique proposed for capturing, analyzing, and monitoring human or animal motion [5][6] for certain applications within real-time virtual reality [7]. The technique records patterns of movement, especially digital movements [8], for digital characters, movies, and video games. Its application in sports and in the performing arts, which require physical mobility and therefore carry a risk of injury, can help prevent such injuries [9]. The current research uses data from an animation framework with six degrees of freedom and inverse kinematics, with enhanced processing based on a stochastic analysis of noise. The extent to which mocap technologies are being used requires a review of current and previous research.
Man has always been compelled to create visual representations of the things he observes in his surroundings [10]. New inventions and developments since the industrial age have enabled man to make fuller use of his five senses [11]. Eadweard Muybridge [12] and Etienne-Jules Marey [13] undertook separate research on animal and human motion by capturing moving objects several times in quick succession and devising equipment to graphically capture and record locomotion such as walking, jumping, running, swimming, flying, and gait [14].
The rotoscope, invented by cartoonist Max Fleischer in the 1900s [15], allowed animators to trace actors by using photographs of their live performances. In the mid-1980s, Robert Abel employed performance animation to produce the commercial Brilliance, which marked the beginning of the second era of performance animation in the entertainment industry; it was the first commercial to use MoCap for 3D animation [16]. Moreover, Wavefront Technologies introduced and demonstrated mainstream 3D character animation in the 1980s, which became the first commercial product available for computer animation [17]. In the 1970s and 1980s, mocap began to be used as a photogrammetric analysis technique in the biomechanical field [18] and, as the technology advanced, it extended into a variety of fields.
Medicine [19], sports [20], entertainment [21], and law/surveillance are the key sectors, which gain advantages from mocap technique [22]. Smaller sectors are also benefiting from this technology; for instance, motion capture devices are utilized to assist in the design of ergonomic settings [23]. Automobile safety testing, in which the motion of crash-tested automobiles is collected and examined, is another application of mocap system [24].
Motion capture technology, which captures an object's motion, is useful in a variety of industries. O'Rourke [25] and Parent [26] identified two basic techniques that are widely used in these industries. The first method employs electromagnetic sensors, which communicate their locations and angles to a central processor that records or displays the motion. To connect with the central processor, the sensors require either wired or wireless communication: the former involves the subject being connected to a wiring harness, while the latter requires the subject to carry a power source such as a battery pack [26]. Electromagnetic sensors have the benefit of being able to capture and present the 3D position and orientation of each sensor in real time, regardless of the person's posture.
FIGURE 1. (a) Right-hand motion of a human body captured by using an Electromagnetic Motion Capture system. (b) Right-hand motion tracking effect [27].
Many of the reflective markers may be concealed from the sight of some of the recording cameras depending on the actor's stance, and this occlusion can cause problems in the data collection process [25]. The optical technique also has the drawback of not providing real-time feedback; since no orientation data is produced immediately, more markers are needed than with magnetic trackers.
FIGURE 2. An Optical Motion Capture system used to capture the complete human body movement [28].
Beckerman [11] divided these two basic methods into sub-categories. Mechanical exoskeletal systems, which directly track the subject's joint angles, are examples of electromagnetic mocap. These are usually rigid structures made up of plastic or metal bars connected by potentiometers, which articulate at the subject's joints. Magnetic systems, on the other hand, use the relative magnetic flux of three orthogonal coils on both the receiver and the transmitter to compute location and angle. By precisely mapping the tracking volume, these systems can determine both distance and angle using the relative intensity of the current or voltage of the three coils. Non-metallic objects do not obstruct the markers, though the systems are subject to magnetic and electrical interference from metallic objects and electric sources in the surroundings. Miniature inertial sensors, biomechanical models, and sensor fusion algorithms are used in inertial systems. Inertial sensor motion data is usually transferred wirelessly to a processor, where it is captured, displayed or recorded. Gyroscopes are used in the majority of inertial systems to record rotations, which are then mapped onto the joints of a skeleton in accompanying software. The complete 6 DoF of subject motion can be captured in real time using an inertial system [11].
Optical mocap systems are classified according to the type of markers used.
Markers covered with a retro-reflective substance are used in passive optical systems to bounce light back to the lens of the camera. The markers are often attached to the skin or to a full-body spandex suit developed for the mocap system. The threshold of the camera can be set to sample only the bright reflective markings, leaving skin and cloth out. The centroid (geometric center) of the marker is calculated as a point within the 2D image. A 3D fix can be achieved only if at least two calibrated cameras have seen a marker; a typical system has 6-24 cameras. Active optical systems use multiple LEDs to establish the relative position of a marker, which is defined in software. The markers are driven to emit their own light instead of reflecting light produced externally. Because light from an active marker travels only half the path length of light returned by a reflective marker, the inverse-square law (which delivers one quarter of the power at twice the distance) works in the active marker's favour. Semi-passive imperceptible marker systems do not require a lighting device for monitoring; instead, tags equipped with photosensors are used to detect them. By applying specially developed multi-LED infrared projectors to encode the space optically, the typical approach based on high-speed cameras can be inverted: the technology decodes optical signals using photosensitive marker tags rather than retro-reflective or active LED markers. Tags fitted with photosensors can derive not only their positions in three dimensions and their velocity vectors, but also other parameters such as incident illumination angle and reflectance, which affect the amount of energy received. These markers can be embedded in garments to track movement while remaining undetectable to the wearer.
Choice of Motion Capture Systems
The choice of MoCap system for a project or study depends on many factors. The type of motion to be captured, as well as the requirements of the subject, has a big impact on the chosen system. The size of the accessible capture volume, the overall impact of the hardware on the movements of the subject, and the desired accuracy of joint orientations would all need to be considered in the overall analysis. The budget available for the mocap session also has a significant impact on the chosen solution; it depends on the type of financial incentives the project offers. Selection of a system is therefore determined by financial and economic constraints as well as technical objectives, and these need to be studied in depth to understand the importance of motion capture systems.
Meador [29] presented a cross-departmental initiative in which real-life artists on stage were integrated with virtual actors in a virtual setting controlled by real-time motion tracking to create a public play. This initiative employed the Gypsy 3 electro-mechanical mocap suit, which used forty-three (43) potentiometers to capture joint orientations. The suit uses an exoskeleton (external skeleton), and its range of motion restricts the exoskeleton to a fifty-foot-radius circle when battery powered, or twenty-five feet when powered from the AC mains. Hence the importance of geometry and of the location and range of AC power sources.
Even though the Gypsy suit had benefits including cheap price, quick and easy setup, and mobility, the report stated that due to the mechanics of the suit, it suffered from motion data drift, limits, and noise of the actor's movements.
Motion drift was mentioned as a constant issue and it was claimed that the virtual performer's function was frequently marred by jerkiness and noise. The drifting issue was blamed on the orientation trackers of the suit, according to the study.
John Haag [30] conducted similar research with the inertial mocap technique for live performance and discovered some quite similar issues. A difficulty with relative positioning was also discovered. He further claimed that the accuracy of the data depended on the precision of the performer's body measurements. According to Haag, highly energetic, acrobatic, or twisted movements may not be captured correctly, and actions like jumping and climbing stairs are hard to record. Similar to Meador, Haag saw the substantial amount of hardware tied to the suit as an issue; however, he viewed this as a costume limitation rather than a performer limitation. Haag identified numerous positive results from the research, some of which were similar to Meador's findings. The mobility and convenience of setup, as well as the ability to run in real time, were praised. Haag mentioned another positive aspect, the range of motion of the suit, which counters Meador's concern: according to Haag, the suit could be operated from a range of up to two hundred meters. The complexity of the virtual space that might be used in the act is another point of contention between the two authors. According to Meador, the production had a 2D feel to it; because of the simplicity of the design and of the imagery in the scene, the aesthetics and impact of the artistic performance were compromised. According to Haag, complex effects, multiple characters, and rich 3D environments can be utilized.
In a study comparable to Haag's, a multidisciplinary team [31] staged a live act in a virtual space employing real actors and digital characters controlled by an actor
wearing an inertial mocap suit. With the Motion Builder software of Autodesk, the movement data was translated to the virtual character. Andreadis, like Haag and Meador, discussed a problem of decreasing positional precision and drift, which got worse over time or was caused by electromagnetic interference. Andreadis's inquiry yielded promising results, including real-time communication between the real-life performers and avatars, which concurred with Haag that a complex virtual space with visual effects might be constructed to improve the live performance.
Linden [32] published a study on vibrotactile feedback supplied in real time to novice violin players. The feedback was calculated from the deviation of the player's bowing gesture from the desired trajectory; any deviation from this trajectory was treated as a fault. Using an Animazoo Lycra suit, movement data was captured using IMUs (Inertial Measurement Units). The findings of the study indicated that participants who received real-time feedback continued to improve their bowing skills long after they had stopped receiving it. A limitation was that the measurements were taken on a sample of only eight participants over six sessions, so the cumulative impact of the technique and its application in a long-term, real-world teaching scenario are unknown.
The impacts of valgus pressure on baseball throwing-related elbow damage are studied using an optical system by Aguinaldo [33].
According to the findings, the pitching method has a substantial relationship with elbow valgus torque. Aguinaldo examined the throws of 69 baseball players, a far larger sample size than Linden's. There was no real-time feedback because an optical system was used. Techniques related to elbow damage caused by elbow valgus torque have been studied in detail, but no strategies to reduce the effect have been investigated or established.
The impacts of balance and plyometric exercise on lower-limb kinematics were tested using a camera-based mocap approach in a study evaluating training techniques to prevent ACL (Anterior Cruciate Ligament) injury in female athletes [34]. The system by "Motion Capture Corporation, California, USA", which was also used and accepted in previous research [35], comprised 8 high-speed cameras, with every subject equipped with 37 retroreflective markers recorded using EVaRT software, and an optimal capture volume of 4.5 x 2 x 2.5 cubic meters. After due selection, the study included eighteen females aged fourteen to seventeen, who showed changes in lower-extremity coronal-plane mechanics, landing with less valgus motion and therefore having a lower risk of injury. According to the research, each training group received feedback in a different way, making it difficult to assess the impact of the training and feedback methods on the outcomes.
The global 3D mocap industry was estimated at $116.7 MN in 2016 and is expected to increase at a CAGR (Compound Annual Growth Rate) of 10.6% from 2017-2024, reaching $258.6 MN in global income by 2024.
Figure 3 shows that the market is divided into five regions: Latin America, Europe, North America, the Asia Pacific, and the Middle East and Africa. Among these, North America held the largest market share in 2016 and is expected to grow significantly over the period projected to 2024. This is due to the region's strong adoption of new technological advancements, as well as the substantial presence of industry participants. Europe and the Asia Pacific, on the other hand, are expected to thrive in the upcoming years, with CAGRs of 10% and 11.6%, respectively, throughout the projected period.
Table I lists some popular mocap system distributors; these systems are commercially available to anyone with a stable internet connection.
FIGURE 3. Global 3D motion capture market, by region, 2014-2024 (in MN USD) [36].
TABLE I
MOTION CAPTURE SYSTEM PROVIDERS FROM ACROSS THE WORLD WITH SUIT PRICE
S. No. | Vendor Name | Vendor Location | Motion Capture Suit Price
1 | OptiTrack [52] | United States | $325
2 | Xsens Technologies [53] | Netherlands | $3,790
3 | Rokoko [54] | San Francisco, USA | $3,495 (Body + Hands Bundle); $2,745 (Only Body Tracking)
4 | Noitom [55] | China | $1,500
An advantage of the proposed prototype is that it is a two-in-one project: firstly, it captures the motion of the object; secondly, it tracks the six DoF, compares them by performing calculations, and performs noise analysis. Xsens Technologies has a huge market in different countries including India, Malaysia, Japan, the USA, and many more [37]. As the primary distributor of its system, Xsens requires users to develop software for that system: a user pays for the software to obtain the framework to be followed during motion tracking. As far as software is concerned, the proposed system uses a COTS product, the readily available Unity 3D, for motion capture and works in any type of environment. With respect to cost, the manufacturing cost of our prototype (a pair of sensors) is around $15; this increases if higher-quality sensors and components are used. A full-body version can be assembled for around $70 to $80, whereas OptiTrack delivers the cheapest commercial mocap suit at $325. As discussed, the proposed system can be used to track the six DoF and calculate the noise so that subsequent captures can be improved.
The current study discussed motion capture, its uses, innovations, and its recent market trends. These results were confirmed by a graph, which showed that the worldwide 3D mocap industry market is expected to grow at a CAGR of 10.6% from 2017-2024.
In recent times, the demand for motion-tracking systems has been increasing [37]. This trend is driven by the medical and gaming industries. In the medical industry, a motion-tracking system is used to help patients in the rehabilitation process, which is an essential application of the mocap system. The gaming industry, on the other hand, uses such systems to create sophisticated virtual worlds in which a game character is operated. The microcontroller to be used should be flexible, low cost, and IoT enabled, and should support the various programming interfaces available. The NodeMCU fits these criteria and is preferred in both industries.
FIGURE 4. Arm motion capture system
An Inertial Measurement Unit (IMU) is a device capable of estimating the orientation of a rigid body. The design consists of two sensors, namely a gyroscope and an accelerometer, plus a microcontroller for data processing. Besides IMU-based motion capture systems, there are a number of technologies and approaches to capture the motion of a rigid body [36], such as optical, mechanical, and acoustic sensing-based setups. In comparison, an IMU-based system is preferable because of its accuracy and precision in scenarios involving a great deal of motion. The average in-situ measurements are also of the order required for human motion detection. To track human motion, multiple IMU devices are required; for instance, as shown in Figure 4, two IMU devices are placed on the lower arm and the upper arm to track the motion of a human arm.
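As a rough, hypothetical sketch of this two-sensor arrangement (the pitch-angle representation and the names below are illustrative assumptions, not the exact implementation used here), the relative elbow angle can be estimated by differencing the two filtered orientations:

```cpp
// Illustrative only: estimate elbow flexion as the difference between the
// orientation of the upper-arm IMU and that of the forearm IMU.
// Orientations are assumed to be filtered pitch angles in degrees.
struct ArmIMUs {
    double upperArmPitchDeg;  // filtered pitch of the upper-arm sensor
    double forearmPitchDeg;   // filtered pitch of the forearm sensor
};

double elbowAngleDeg(const ArmIMUs &arm) {
    // The relative angle between the two segments approximates elbow flexion.
    return arm.forearmPitchDeg - arm.upperArmPitchDeg;
}
```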
Figure 5 presents a detailed flow chart for the real-time MoCap system, showing how the hardware and software are set up and how the noise measurements are observed.
FIGURE 5. The process flow of the Real-Time Motion Capture System.
Figure 6 shows the architecture of the six degrees of freedom emulated on the human arm and mapped using COTS components.
The prototype consists of only two MPU-6050 sensors and one microcontroller, for which a NodeMCU was selected. This sensor was chosen because of its cost-effectiveness for IoT edge-node analysis and implementation.
On a single chip, the MPU-6050 integrates a three-axis gyroscope, a three-axis accelerometer, and a Digital Motion Processor (DMP) [38]. The I2C protocol is used to transfer data, and the device features sixteen-bit analog-to-digital converters for quantizing the accelerometer and gyroscope outputs. The sensor can be used for both slow and fast motion.
The selectable accelerometer measurement ranges are ±2g, ±4g, ±8g, and ±16g.
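A minimal Arduino-style sketch of reading the raw MPU-6050 values over I2C is shown below; it assumes the sensor's default I2C address (0x68) and the register addresses from the MPU-6050 datasheet, and is an illustration rather than the exact firmware used in this work.

```cpp
// Minimal sketch (assumptions: default I2C address 0x68, default ±2g / ±250 dps ranges).
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;

// Combine two consecutive bytes from the I2C buffer into a signed 16-bit value.
int16_t read16() {
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (int16_t)((hi << 8) | (lo & 0xFF));
}

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);              // PWR_MGMT_1 register
  Wire.write((uint8_t)0);        // wake the MPU-6050 from sleep
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);              // ACCEL_XOUT_H: first of 14 data registers
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14, (uint8_t)true);
  int16_t ax = read16(), ay = read16(), az = read16();
  read16();                      // skip the temperature registers
  int16_t gx = read16(), gy = read16(), gz = read16();
  Serial.print(ax); Serial.print(' '); Serial.println(gx);
  delay(100);                    // 100 ms sample interval, as in the experiments
}
```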
The ESP8266 is an SoC (System on a Chip) made by Espressif, a Chinese firm. The ESP8266 NodeMCU is an extensively used development board that combines simple programming through the Arduino IDE (C/C++) with good WiFi functionality. It is based on the ESP-12E WiFi module.
With the built-in firmware, research, development, testing, and prototype projects become simple and elegant. In this case, a CH340G USB-to-serial chip allows easy flashing of the ESP8266 and serial output to a computer.
The complementary filter shown in Figure 3.6 comprises a high-pass and a low-pass filter and is a computationally affordable sensor fusion approach [39]. The idea of the complementary filter is to blend the fast-changing signals from a gyroscope with the slowly changing signals from an accelerometer.
1) ACCELEROMETER AND GYROSCOPE
Positional sensors are of two types: one measures an object's translational motion and the other measures an object's angular shift [40]. This section covers the mathematical modeling of the gyroscope and of the angular shift of the object in question. As discussed earlier, the sensor has a three-axis accelerometer, a three-axis gyroscope, and a DMP, and is accurate for both fast and slow motions. The measurement ranges of the gyroscope are ±250, ±500, ±1,000, or ±2,000 dps (degrees per second). Furthermore, the measurement ranges of the accelerometer are ±2g, ±4g, ±8g, or ±16g, where 1 g represents the gravitational acceleration, roughly 9.81 m/s2.
Primarily, accelerometers are used to evaluate the acceleration of an object. Some of the most common applications are slope sensors, vibration sensors, and positioning systems. Smartphone screens, which rotate the displayed image to match the current direction of the gravitational acceleration, are a well-known application of accelerometers. An accelerometer is a small device that detects variations in position and orientation with respect to Cartesian coordinates and signals the display screen to rotate.
Consider an accelerometer in a horizontal position, such as on a table. Whenever the sensor is tilted, the g force is distributed over the other two perpendicular axes; as a result, trigonometry can be used to determine the angles [41].
Figure 10 shows the tilt angle recorded along a single axis (the X-axis). The sine of the angle between the accelerometer X-axis and the horizon gives the output acceleration of the gravity vector on the X-axis, which is equal to its projection as per fundamental trigonometry [42]. The horizon is defined as the plane perpendicular to the gravity vector. The output acceleration for an ideal value of 1 g for gravity is:
A(x,out)[g]=1g×sin(θ) (3.1)
Fig. 9 demonstrates the different sensor configurations, and Eqs. (3.2)-(3.4) determine the corresponding angles. For the X, Y, and Z-axes, the accelerometer provides the readings Ax, Ay, and Az.
$\rho = \tan^{-1}\left(\frac{A_x}{\sqrt{A_y^2 + A_z^2}}\right)$ (3.2)
$\phi = \tan^{-1}\left(\frac{A_y}{\sqrt{A_x^2 + A_z^2}}\right)$ (3.3)
$\theta = \tan^{-1}\left(\frac{A_z}{\sqrt{A_x^2 + A_y^2}}\right)$ (3.4)
The values obtained from the MPU-6050 are raw values. Raw values are dimensionless quantities, which are converted to m/s2 or multiples of g. According to the datasheet, the raw value is divided by 16,384 for the ±2g measuring range. As a consequence, if the sensor produces a reading of 16,384 on any axis, that axis is experiencing 1 g of acceleration.
Angles can be measured either before or after this division. Motion in three dimensions affects the sensors and the displacement measurements: it exerts forces on the accelerometer in addition to gravity, so the computed angle is not accurate during sustained accelerations in the horizontal plane. The gyroscope helps to reduce this effect.
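A short sketch of this conversion together with Eqs. (3.2)-(3.4) is given below, assuming the 16,384 LSB/g sensitivity of the ±2g range; atan2 is used in place of the plain arctangent for numerical robustness, and the names are illustrative rather than the exact code used in the prototype.

```cpp
// Convert raw ±2g accelerometer counts to g and compute the tilt angles
// of Eqs. (3.2)-(3.4). Raw inputs are the 16-bit values read from the sensor.
#include <cmath>
#include <cstdint>

const double ACC_LSB_PER_G = 16384.0;   // datasheet sensitivity for the ±2g range
const double PI_CONST = 3.14159265358979;

struct TiltAngles { double rho, phi, theta; };   // degrees

TiltAngles tiltFromRaw(int16_t rawAx, int16_t rawAy, int16_t rawAz) {
    double ax = rawAx / ACC_LSB_PER_G;   // acceleration in g
    double ay = rawAy / ACC_LSB_PER_G;
    double az = rawAz / ACC_LSB_PER_G;
    const double RAD2DEG = 180.0 / PI_CONST;
    TiltAngles t;
    t.rho   = std::atan2(ax, std::sqrt(ay * ay + az * az)) * RAD2DEG;  // Eq. (3.2)
    t.phi   = std::atan2(ay, std::sqrt(ax * ax + az * az)) * RAD2DEG;  // Eq. (3.3)
    t.theta = std::atan2(az, std::sqrt(ax * ax + ay * ay)) * RAD2DEG;  // Eq. (3.4)
    return t;
}
```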
The orientation of the sensor body is maintained or measured using the gyroscope. The concept of the gyroscope is simple: the sensor measures the rate of change of the segment's orientation. From this parameter, the angular velocity in degrees per second is measured and the angular shift is calculated with respect to previously known references [42][43].
To compute orientation, it is extremely important to first set the sensor to a given position by using the accelerometer and then use the gyroscope to evaluate the angular velocity on the X, Y, and Z-axes within a predetermined time. The magnitude of angular shift Gy, as recorded by the gyroscope, is given by Eq. (3.5).
Gy=∫ω dt (3.5)
For the gyroscope, the MPU-6050 also provides dimensionless readings. According to the datasheet, the raw values should be divided by 131 to obtain dps (degrees per second), assuming a measuring range of ±250 dps. The researcher performed measurements over time to keep track of the drift that the gyroscope sensor experiences: due to the inherent errors of the indefinite integral, the value rises or falls even when the sensor is stationary. This bias, also referred to as the drift rate, is important for calculating parameters.
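The following fragment sketches this integration step under the same assumptions (131 LSB per degree per second for the ±250 dps range, a fixed sampling period, and a bias estimated while the sensor is at rest); it is illustrative rather than the exact firmware.

```cpp
// Integrate raw gyroscope counts into an angle, Eq. (3.5), after removing
// a bias estimated while the sensor is stationary.
#include <cstdint>

const double GYRO_LSB_PER_DPS = 131.0;  // datasheet sensitivity for ±250 dps

double gyroAngleDeg = 0.0;              // accumulated angle
double gyroBiasDps  = 0.0;              // estimated while the sensor is at rest

void updateGyroAngle(int16_t rawGyro, double dtSeconds) {
    double rateDps = rawGyro / GYRO_LSB_PER_DPS - gyroBiasDps;  // angular velocity
    gyroAngleDeg += rateDps * dtSeconds;                        // Gy = integral of ω dt
}
```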
The long-term reading of an accelerometer is quite precise; however, it produces a lot of noise in instantaneous readings and during motion. The gyroscope, on the other hand, gives good short-term values for changing orientation; however, because the final angle is obtained by integration, it accumulates an error that grows with time.
Combining a gyroscope and an accelerometer is a technique for reducing drift, obtaining more accurate readings, and limiting error propagation. Therefore, the researcher chose to utilize a complementary filter to accomplish this task.
The complementary filter is a simple linear combination that assigns a weight to each input, in this scenario one to the gyroscope and one to the accelerometer; the sum of these weights must be one. Eqs. (3.6)-(3.8) represent the mathematical relationships employed by this filter.
θ = α·Gy + (1 − α)·Acc (3.6)
α = β / (β + dt) (3.7)
Gy = θ + ω·dt (3.8)
where α is a trust factor established by the smoothing of the accelerometer and gyroscope contributions (usually set to 0.1), dt is the sampling time, β is the time constant, ω is the angular velocity in degrees per second detected by the gyroscope, and Acc is the angle in degrees obtained from the accelerometer.
To achieve good readings, the typical time scale of the accelerometer noise must be shorter than the time constant. As illustrated in [39], the time constant was 1 s and the observed sampling time was 0.004 s, yielding an α approximately equal to 0.996. It was observed that the system obtained more accurate and less noisy readings using this alpha value.
Since this filter uses little computational power, even though it is not mathematically exact, it can be employed in small, low-processing systems [44]. As a result, the complementary filter is a good fit for this project.
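A compact sketch of Eqs. (3.6)-(3.8) is shown below, assuming the gyroscope rate and the accelerometer angle have already been obtained as described above and that β = 1 s; it is a sketch of the technique, not the exact implementation used.

```cpp
// Complementary filter, Eqs. (3.6)-(3.8): blend the integrated gyroscope
// angle (good short-term) with the accelerometer angle (good long-term).
double fusedAngleDeg = 0.0;

double complementaryFilter(double gyroRateDps, double accAngleDeg, double dt) {
    const double beta  = 1.0;                 // time constant in seconds, Eq. (3.7)
    const double alpha = beta / (beta + dt);  // approx. 0.996 for dt = 0.004 s
    double gyroAngle = fusedAngleDeg + gyroRateDps * dt;              // Eq. (3.8)
    fusedAngleDeg = alpha * gyroAngle + (1.0 - alpha) * accAngleDeg;  // Eq. (3.6)
    return fusedAngleDeg;
}
```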
The aim of this project is to make the motion capture system accessible to all. There are two phases for testing the designed prototype: Phase I includes the steps to set up the project, while Phase II includes two tests, the first for the real-time motion capture system and the second for the IoT-based six-DoF tracking system.
Phase I involves setting up the prototype before implementation or testing; this part mainly covers the installation process.
The aim is to set up the system for testing and connect the project to the host computer, to make sure the system is working properly before moving on to the testing process. The setup phase consisted of the following steps:
Phase II of testing is divided into two portions to test both sub-projects, the first portion covers the testing of Real-Time Motion Capture System using Unity 3D and the second covers the testing of the IoT based Six DOF tracking system.
The goal of this phase is to perform the testing process of the project and collect useful data for further calculations.
Once the system is ready, the microcontroller is programmed using the Arduino IDE, and the real-time motion capture system is analyzed using Unity 3D.
Once the code is uploaded on the microcontroller, click on the serial monitor to check the working of the microcontroller and close the serial monitor.
Open the Unity project file in Unity 3D, select the port in the Receiver.cs file for communication between the sensors and the skeleton, and press the run button. Remember that while working with Unity 3D, the Arduino IDE should be closed, or at least its serial monitor must be closed.
FIGURE 13. Programming flow chart
As illustrated in Figure 13, two sensors were connected to a NodeMCU (ESP8266 Wi-Fi module), which was programmed using the Arduino IDE, which supports C/C++ programming. It is up to the user to decide which program to upload: to capture motion and drive the skeleton, the user uploads the Unity 3D program; to track the 6 DoF of an object, the user uploads the IoT program and uses the Blynk app on a smartphone to see the results. The Android application is linked through the program to show the output, which can be viewed in the referenced application from any mobile device.
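As an outline of the IoT path only, the sketch below pushes sensor values to the Blynk cloud using the classic BlynkSimpleEsp8266 interface; the authentication token, Wi-Fi credentials, virtual-pin mapping, and the readFilteredDoF() helper mentioned in the comments are placeholders and assumptions, not the exact code used in this project.

```cpp
// Sketch of the IoT program path: push the six DoF values to the Blynk cloud.
// Placeholders: BLYNK_AUTH, WIFI_SSID, WIFI_PASS and the virtual-pin mapping.
#define BLYNK_PRINT Serial
#include <ESP8266WiFi.h>
#include <BlynkSimpleEsp8266.h>

char auth[] = "BLYNK_AUTH";
char ssid[] = "WIFI_SSID";
char pass[] = "WIFI_PASS";

void setup() {
  Serial.begin(115200);
  Blynk.begin(auth, ssid, pass);   // connect to Wi-Fi and the Blynk cloud
}

void loop() {
  Blynk.run();
  // readFilteredDoF() is assumed to return the six filtered values
  // (three tilt angles and three accelerations) from the MPU-6050 code above,
  // which would then be sent as, e.g.:
  // Blynk.virtualWrite(V0, rho); Blynk.virtualWrite(V1, phi); ...
  delay(100);                      // 100 ms between samples, as in the tests
}
```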
Programming for both the real-time motion capture system and the IoT-based system was implemented so that the full project could be executed. The prototype as a whole was set up and all sensors were placed successfully. Both the IoT-based six-DoF extraction system and the real-time motion capture system were put through their paces using Unity 3D. The entire implementation process went smoothly and the expected results were achieved.
After the project was completed, it was inspected and tested, and the outcomes have various implications. The values of the parameters selected on the Blynk application were obtained using the Internet of Things (IoT)-based system.
The entire prototype was set up in a self-made isolated room for testing, with no air ventilation and only two pieces of electronic equipment present: a laptop as the host computer (Dell with an Intel(R) Core(TM) i7-3612QM CPU @ 2.10 GHz) and a smartphone (Oppo A1k) to note the results in the Blynk application. The microcontroller, with its built-in Wi-Fi module, was connected to both sensors by short cables and was responsible for tracking, capturing, and transferring the parameter data to the smartphone via cloud computing. As discussed above, the project is categorized into two sub-projects: the Real-Time Motion Capture system and the IoT-based Six DoF Tracking System. First, the results of the IoT-based Six DoF Tracking System, namely the gyroscope and accelerometer readings, are discussed. To simplify the calculations, the researcher performed five experiments using a two-second sample time interval for one sensor. The sample rate was set with the help of a time delay; each sample of the collected data was gathered after a delay of 100 ms during the demonstration. To run a system or device properly, one has to calibrate it first, as was done in this study. The system was first calibrated so that displacement and velocity could easily be coordinated while tracking the six DoF. For calibration, the sensor was set to (0, 0, 0) as shown in Fig. 4.3.
Table II shows all the readings of the gyroscope and accelerometer for sensor-1 along the X-axis in the calibration process. A total of five experiments were performed. Using a sample time interval of two seconds, all the readings were observed on the Blynk IoT mobile application. Likewise, Table III and Table IV show the sensor-1 readings for the Y-axis and Z-axis, respectively.
The offset bias can be calculated by:
Offset Bias = Experiment − Average of all Experiments (5.1)
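As a worked example of Eq. (5.1), the short program below applies it to the unfiltered gyroscope readings of Table II and reproduces the offset-bias column of that table.

```cpp
// Offset bias of Eq. (5.1): each experiment minus the mean of all experiments.
#include <cstdio>

int main() {
    const int n = 5;
    // Unfiltered S1X gyroscope readings from Table II.
    double readings[n] = {13.36, 15.01, 17.81, 19.57, 21.11};
    double sum = 0.0;
    for (int i = 0; i < n; ++i) sum += readings[i];
    double mean = sum / n;                                  // 17.372
    for (int i = 0; i < n; ++i)
        std::printf("experiment %d: offset bias = %.3f\n", i + 1, readings[i] - mean);
    // Prints -4.012, -2.362, 0.438, 2.198, 3.738, matching Table II.
}
```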
Here the displacement is along the X-axis: sensor-1 is moved 2 inches along the X-axis, so its new location is (2 inch, 0, 0), as shown in Fig. 4.4.
Table V shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the X-axis, with the running average relative error of the gyroscope values in the last column. Table VI shows the filtered and unfiltered accelerometer readings, with and without offset bias, for sensor-1 along the X-axis, with the running average relative error of the accelerometer values in the last column. The Power of Noise (PN) in dB is reported with each table.
TABLE II
THE FILTERED, UNFILTERED AND OFFSET BIAS VALUES OF SENSOR 1 ALONG X-AXIS
Expr No. | Unfiltered Readings of S1X Gyro | Offset Bias | Filtered Readings of S1X Gyro | Offset Bias | Unfiltered Readings of S1X Acc | Offset Bias | Filtered Readings of S1X Acc | Offset Bias
1 | 13.36 | -4.012 | 12.57 | -3.156 | 1.38 | 0.318 | 1.21 | 0.34
2 | 15.01 | -2.362 | 14.67 | -1.056 | 1.15 | 0.088 | 0.97 | -0.1
3 | 17.81 | 0.438 | 15.78 | 0.054 | 1.01 | -0.052 | 0.83 | -0.04
4 | 19.57 | 2.198 | 17.32 | 1.594 | 0.93 | -0.132 | 0.69 | -0.18
5 | 21.11 | 3.738 | 18.29 | 2.564 | 0.84 | -0.222 | 0.65 | -0.22
TABLE III
THE FILTERED, UNFILTERED AND OFFSET BIAS VALUES OF SENSOR 1 ALONG Y-AXIS
Expr No. | Unfiltered Readings of S1X Gyro | Offset Bias | Filtered Readings of S1X Gyro | Offset Bias | Unfiltered Readings of S1X Acc | Offset Bias | Filtered Readings of S1X Acc | Offset Bias
1 | 12.88 | -3.484 | 11.74 | -2.754 | 1.26 | 0.142 | 1.18 | 0.14
2 | 15.49 | -0.874 | 13.21 | -1.284 | 1.21 | 0.092 | 1.15 | 0.11
3 | 16.29 | -0.074 | 14.17 | -0.324 | 1.13 | 0.012 | 1.08 | 0.04
4 | 17.54 | 1.176 | 16.26 | 1.766 | 1.01 | -0.108 | 0.92 | -0.12
5 | 19.62 | 3.256 | 17.09 | 2.596 | 0.98 | -0.138 | 0.87 | -0.17
TABLE IV
THE FILTERED, UNFILTERED AND OFFSET BIAS VALUES OF SENSOR 1 ALONG THE Z-AXIS
Expr No. | Unfiltered Readings of S1X Gyro | Offset Bias | Filtered Readings of S1X Gyro | Offset Bias | Unfiltered Readings of S1X Acc | Offset Bias | Filtered Readings of S1X Acc | Offset Bias
1 | 13.22 | -5.66 | 12.63 | -4.37 | 1.36 | 0.314 | 1.26 | 0.332
2 | 16.83 | -2.05 | 15.01 | -1.99 | 1.19 | 0.144 | 1.05 | 0.122
3 | 18.61 | -0.27 | 16.94 | -0.06 | 1.03 | -0.016 | 0.91 | -0.018
4 | 21.99 | 3.11 | 19.48 | 2.48 | 0.91 | -0.136 | 0.79 | -0.138
5 | 23.75 | 4.87 | 20.94 | 3.94 | 0.74 | -0.306 | 0.63 | -0.298
TABLE V
ALL THE READINGS OF THE GYROSCOPE ALONG THE X-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE X-AXIS
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 14.97 | 18.982 | 13.88 | 17.036 | 0.76
2 | 15.43 | 17.792 | 14.27 | 15.326 | 0.95
3 | 17.37 | 16.932 | 16.21 | 16.156 | 0.12
4 | 18.05 | 15.852 | 17.89 | 16.296 | 0.02
5 | 20.11 | 16.372 | 19.13 | 16.566 | 0.29
PN = -10.0118 dB
TABLE VI
ALL THE READINGS OF THE ACCELEROMETER ALONG THE X-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE X-AXIS
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 2.21 | 1.892 | 2.11 | 1.77 | 0.012
2 | 2.08 | 1.992 | 1.94 | 1.84 | 0.058
3 | 1.93 | 1.982 | 1.81 | 1.85 | 0.068
4 | 1.78 | 1.912 | 1.63 | 1.81 | 0.028
5 | 1.53 | 1.752 | 1.42 | 1.64 | 0.142
PN = -44.707 dB
TABLE VII
ALL THE READINGS OF THE GYROSCOPE ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (2 INCH, 0, 0)
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 13.26 | 18.92 | 12.19 | 16.56 | 1.7
2 | 15.17 | 17.22 | 13.00 | 14.99 | 0.13
3 | 16.57 | 16.84 | 14.07 | 14.13 | 0.73
4 | 18.87 | 15.76 | 16.54 | 14.06 | 0.8
5 | 20.81 | 15.94 | 18.50 | 14.56 | 0.3
PN = -1.577 dB
TABLE VIII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Y-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE X-AXIS
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 1.40 | 1.086 | 1.27 | 0.938 | 0.0504
2 | 1.29 | 1.146 | 1.18 | 1.058 | 0.0696
3 | 1.17 | 1.186 | 1.05 | 1.068 | 0.0796
4 | 1.03 | 1.166 | 0.86 | 0.998 | 0.0096
5 | 0.87 | 1.008 | 0.71 | 0.88 | 0.1084
PN = -45.827 dB
TABLE IX
ALL THE READINGS OF THE GYROSCOPE ALONG THE Z-AXIS, WHEN SENSOR-1 WAS PLACED AT (2 INCH, 0, 0)
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 13.26 | 16.744 | 11.19 | 14.944 | 1.8
2 | 15.17 | 16.044 | 12.00 | 14.284 | 1.76
3 | 16.57 | 16.644 | 14.15 | 14.394 | 2.25
4 | 18.87 | 17.694 | 15.45 | 14.774 | 2.92
5 | 20.81 | 17.554 | 19.05 | 15.904 | 1.65
PN = -13.14869 dB
TABLE X
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Z-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE X-AXIS
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 1.51 | 1.368 | 1.38 | 1.24 | 0.202
2 | 1.36 | 1.268 | 1.25 | 1.14 | 0.102
3 | 1.15 | 1.138 | 1.03 | 0.99 | 0.048
4 | 0.94 | 1.048 | 0.82 | 0.94 | 0.098
5 | 0.87 | 1.008 | 0.71 | 0.88 | 0.158
PN = -35.081 dB
TABLE XI
ALL THE READINGS OF THE GYROSCOPE ALONG THE X-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 2 INCH, 0)
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 13.26 | 16.744 | 11.19 | 14.944 | 1.8
2 | 15.17 | 16.044 | 12.00 | 14.284 | 1.76
3 | 16.57 | 16.644 | 14.15 | 14.394 | 2.25
4 | 18.87 | 17.694 | 15.45 | 14.774 | 2.92
5 | 20.81 | 17.554 | 19.05 | 15.904 | 1.65
PN = -13.14869 dB
TABLE XII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE X-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE Y-AXIS
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 2.37 | 2.052 | 2.26 | 1.92 | 0.022
2 | 2.22 | 2.132 | 2.13 | 2.23 | 0.288
3 | 2.04 | 2.092 | 1.90 | 1.94 | 0.002
4 | 1.85 | 1.982 | 1.73 | 1.91 | 0.032
5 | 1.69 | 1.912 | 1.49 | 1.71 | 0.232
PN = -31.164 dB
TABLE XIII
ALL THE READINGS OF THE GYROSCOPE ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 2 INCH, 0)
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 15.04 | 20.7 | 14.82 | 19.19 | 1.68
2 | 16.45 | 18.5 | 15.34 | 17.33 | 0.18
3 | 17.81 | 18.08 | 17.11 | 17.17 | 0.34
4 | 20.00 | 16.89 | 19.86 | 17.38 | 0.13
5 | 22.04 | 17.17 | 20.42 | 16.48 | 1.03
PN = -1.834 dB
TABLE XIV
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 2 INCH, 0)
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 2.01 | 1.696 | 1.90 | 1.568 | 0.028
2 | 1.92 | 1.776 | 1.75 | 1.628 | 0.088
3 | 1.85 | 1.866 | 1.60 | 1.618 | 0.078
4 | 1.53 | 1.666 | 1.42 | 1.558 | 0.018
5 | 1.22 | 1.526 | 1.03 | 1.328 | 0.212
PN = -38.433 dB
TABLE XV
ALL THE READINGS OF THE GYROSCOPE ALONG THE Z-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE Y-AXIS
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 13.26 | 16.744 | 12.09 | 14.844 | 0.286
2 | 14.71 | 15.584 | 13.40 | 14.684 | 0.446
3 | 16.32 | 16.394 | 15.27 | 15.594 | 0.464
4 | 17.48 | 16.304 | 16.46 | 14.694 | 0.436
5 | 19.16 | 15.904 | 18.43 | 15.834 | 0.704
PN = -12.529 dB
TABLE XVI
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Z-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE Y-AXIS
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 1.43 | 1.288 | 1.32 | 1.18 | 0.108
2 | 1.33 | 1.238 | 1.27 | 1.16 | 0.088
3 | 1.24 | 1.228 | 1.15 | 1.11 | 0.038
4 | 1.01 | 1.118 | 0.89 | 1.01 | 0.062
5 | 0.84 | 0.978 | 0.73 | 0.9 | 0.172
PN = -39.286 dB
TABLE XVII
ALL THE READINGS OF THE GYROSCOPE ALONG THE X-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 0, 2 INCH)
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 13.76 | 17.772 | 12.85 | 16.006 | 1.046
2 | 14.38 | 16.742 | 13.71 | 14.766 | 0.194
3 | 15.81 | 15.372 | 15.02 | 14.966 | 0.006
4 | 16.92 | 14.722 | 15.91 | 14.316 | 0.644
5 | 18.04 | 14.302 | 17.31 | 14.746 | 0.214
PN = -9.938 dB
TABLE XVIII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE X-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE Z-AXIS
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 1.85 | 1.532 | 1.71 | 1.37 | 0.118
2 | 1.72 | 1.632 | 1.62 | 1.72 | 0.232
3 | 1.60 | 1.652 | 1.48 | 1.52 | 0.032
4 | 1.42 | 1.552 | 1.29 | 1.47 | 0.018
5 | 1.29 | 1.512 | 1.14 | 1.36 | 0.128
PN = -35.342 dB
TABLE XIX
ALL THE READINGS OF THE GYROSCOPE ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 0, 2 INCH)
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 12.17 | 17.83 | 11.72 | 16.09 | 1.13
2 | 14.12 | 16.17 | 13.00 | 14.99 | 0.03
3 | 15.41 | 15.68 | 14.76 | 14.82 | 0.14
4 | 17.02 | 13.91 | 16.67 | 14.19 | 0.77
5 | 19.32 | 14.45 | 18.65 | 14.71 | 0.25
PN = -8.166 dB
TABLE XX
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 0, 2 INCH)
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 1.32 | 1.006 | 1.27 | 0.938 | 0.044
2 | 1.22 | 1.076 | 1.17 | 1.048 | 0.066
3 | 1.08 | 1.096 | 1.00 | 1.018 | 0.036
4 | 0.95 | 1.086 | 0.80 | 0.938 | 0.044
5 | 0.79 | 1.096 | 0.67 | 0.968 | 0.014
PN = -54.226 dB
TABLE XXI
ALL THE READINGS OF THE GYROSCOPE ALONG THE Z-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE Z-AXIS
Expr No. | Unfiltered Readings of S1X Gyro with Offset Bias | Unfiltered Readings of S1X Gyro without Offset Bias | Filtered Readings of S1X Gyro with Offset Bias | Filtered Readings of S1X Gyro without Offset Bias | Running Average Relative Error
1 | 14.61 | 18.094 | 13.19 | 15.944 | 0.096
2 | 15.53 | 16.404 | 14.00 | 15.284 | 0.564
3 | 16.24 | 16.314 | 14.98 | 15.304 | 0.544
4 | 18.67 | 17.494 | 17.54 | 15.774 | 0.074
5 | 21.18 | 17.924 | 19.53 | 16.934 | 1.086
PN = -8.834 dB
TABLE XXII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Z-AXIS, WHEN SENSOR-1 MOVED 2 INCHES ALONG THE Z-AXIS
Expr No. | Unfiltered Readings of S1X Acc with Offset Bias | Unfiltered Readings of S1X Acc without Offset Bias | Filtered Readings of S1X Acc with Offset Bias | Filtered Readings of S1X Acc without Offset Bias | Running Average Relative Error
1 | 1.30 | 1.158 | 1.19 | 1.05 | 0.144
2 | 1.17 | 1.078 | 1.07 | 0.96 | 0.054
3 | 1.04 | 1.028 | 0.93 | 0.89 | 0.016
4 | 0.89 | 0.998 | 0.76 | 0.88 | 0.026
5 | 0.76 | 0.898 | 0.58 | 0.75 | 0.156
PN = -40.189 dB
Eq. (5.2) was used to remove the offset bias from the filtered and unfiltered gyroscope and accelerometer values.
Without Offset Bias=With Offset Bias-Offset Bias (5.2)
Eq. (5.3) was used to calculate the running average relative error for both the gyroscope and accelerometer readings.
Running Average Relative Error = |Filtered without Bias Value − Average of all Filtered without Bias Values| (5.3)
To calculate the Power of Noise, Eq. (5.4) was used,
$PN = \frac{1}{n}\sum_{i=1}^{n}\left(\text{Running Average Relative Error}_i\right)^2$ (5.4)
where PN is the Power of Noise and n is the number of experiments.
Eq. (5.5) converts the Power of Noise (PN) into decibels:
PN(dB) =20 log(PN) (5.5)
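The sketch below chains Eqs. (5.1)-(5.5) together using the filtered, bias-free gyroscope readings of Table V; interpreting the noise power of Eq. (5.4) as the mean of the squared errors reproduces the PN value of about -10.01 dB reported in that table.

```cpp
// Eqs. (5.3)-(5.5) applied to the filtered, bias-removed gyroscope readings
// of Table V; the program reproduces the reported PN of about -10.01 dB.
#include <cmath>
#include <cstdio>

int main() {
    const int n = 5;                                  // number of experiments
    double filteredWithoutBias[n] = {17.036, 15.326, 16.156, 16.296, 16.566};

    double mean = 0.0;
    for (int i = 0; i < n; ++i) mean += filteredWithoutBias[i];
    mean /= n;

    double sumSq = 0.0;
    for (int i = 0; i < n; ++i) {
        double err = std::fabs(filteredWithoutBias[i] - mean);   // Eq. (5.3)
        sumSq += err * err;
    }
    double pn = sumSq / n;                            // noise power, Eq. (5.4)
    double pnDb = 20.0 * std::log10(pn);              // Eq. (5.5)
    std::printf("PN = %.4f dB\n", pnDb);              // approx. -10.01 dB
}
```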
Table VII shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the Y-axis, together with the running average relative error of the gyroscope values. Table VIII shows the filtered and unfiltered accelerometer readings, with and without offset bias, for sensor-1 along the Y-axis, together with the running average relative error of the accelerometer values.
Table IX shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the Z-axis, together with the running average relative error of the gyroscope values. Table X shows the filtered and unfiltered accelerometer readings, with and without offset bias, for sensor-1 along the Z-axis, together with the running average relative error of the accelerometer values.
Next, sensor-1 was displaced 2 inches along the Y-axis and the results were analyzed; its new location is (0, 2 inch, 0), as shown in Fig. 4.6.
Table XI shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the X-axis, and Table XII shows the corresponding filtered and unfiltered accelerometer readings along the X-axis. In Tables XI to XXII, the running average relative error of the gyroscope and accelerometer values is also given.
Table XIII shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the Y-axis, and Table XIV shows the corresponding accelerometer readings along the Y-axis.
To track the values of all six DoF, the same procedure was performed for the Z-axis by moving sensor-1 2 inches in the Z direction and analyzing the results; the new location of sensor-1 became (0, 0, 2 inch), as shown in Fig. 4.5.
Table XVII shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the X-axis, and Table XVIII shows the corresponding accelerometer readings along the X-axis.
Table XIX shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the Y-axis, and Table XX shows the corresponding accelerometer readings along the Y-axis.
Table XXI shows the filtered and unfiltered gyroscope results, with and without offset bias, for sensor-1 along the Z-axis, and Table XXII shows the corresponding accelerometer readings along the Z-axis.
A real-time motion capture system is the main part of the proposed prototype. The mocap system is configured using user-supplied parameters, including dimensions and environment, and the program then performs real-time motion capture. The sensor is paired with the skeleton in Unity 3D, so that whenever the sensor detects an action or movement, the skeleton moves automatically; it is up to the user to decide what to control or move with the sensors. In this prototype, the right arm (both the upper and lower segments) was controlled through motion sensors. As indicated in Section 4.1.1, the system was programmed using the Arduino IDE to capture the motion. After 10-15 seconds, the Unity 3D skeleton begins to follow the captured motion.
FIGURE 14. Unity 3D Software Interface with Designed Skeleton
FIGURE 15. Running Project on Unity 3D
FIGURE 16. Testing the Real-Time MoCap System
Every mocap vendor believes that its solution should be the first choice of studios and developers alike, at both large and small scale. The prototype was therefore compared with offerings from vendors such as Xsens and Rokoko. Viz Guru, a YouTuber, shared reviews of different mocap suits; this video is also available on the official Xsens Technologies website, and any interested user can view a live demonstration of the prototype by scanning the QR code provided separately.
It was observed that the Rokoko suit does not have level support, so it cannot be used to climb stairs or similar structures because the feet are practically tethered to the ground, whereas Xsens and this prototype have level support and can therefore be used for jumping and walking upstairs.
Secondly, Rokoko provides a full-body suit with built-in sensors, which is much faster to put on but comes in general-purpose sizes, while both Xsens and the proposed system use individual sensors that can be mounted on any object or human body. This system takes some time to set up, but it can be used on bodies of all sizes and shapes.
Furthermore, Mr. Guru elaborated that Xsens users had to pay extra for Xsens Studio, while Rokoko provided its users with the free Rokoko Studio along with its suit. In comparison, our system uses readily available software such as Unity 3D or Blender; in our case, Unity 3D was used.
This section presented the results of the suggested prototype, from the calibration results to the values of the six DoF. Table II, Table III, and Table IV show all the gyroscope and accelerometer readings for sensor-1 during calibration along the X-axis, Y-axis, and Z-axis, respectively, which implies that calibration along all three sensor axes is possible. Table V and the subsequent tables show the filtered and unfiltered results, with and without offset bias, of the gyroscope and accelerometer of sensor-1 along the X-axis, Y-axis, and Z-axis, together with the running average relative error of the gyroscope and accelerometer values; the Power of Noise (PN) in dB is reported with each table and indicates the stability and reliability of the proposed system. For the gyroscope, the RMS percentage values of PN were 16.194, 4.917 and 11.660 along the X-axis, Y-axis, and Z-axis respectively; all these values show that much of the noise was removed from the recorded values. In the case of the accelerometer, these values were 37.500, 46.610, and 38.250. Furthermore, this section described the configuration process to capture the real-time motion of an object. Lastly, the proposed prototype was compared with the systems available on the market.
The current research described the development of a real-time IMU-based mocap system; to enhance flexibility and usability, a serial-chain network was proposed to record orientation data from multiple IMU sensors. Based on the results of the experiments, the designed prototype, which comprises two IMU sensors, was capable of gathering orientation data from a reasonably spaced object.
Motion capture (mo-cap or mocap) is a technique for recording the motion of an object or human. It has innumerable applications in different fields, such as entertainment, military, medical, sports, robot validation [45] and computer vision [46]. Primarily, it is used in video games and filmmaking to capture and record human actions and animate digital character models in 3D or 2D on the host computer [47-49]. It is often described as performance capture [50] when it involves the fingers and face or captures delicate expressions. Motion capture is sometimes referred to as motion tracking in numerous fields, a reminder of how widespread the impact of this technology is today. The motions of one or more performers are sampled many times per second during motion capture, as implemented in the various simulation techniques discussed in this paper. Unlike early systems, which used photographs from numerous cameras to determine 3D positions [50], the goal of motion capture is frequently to record just the actor's actions rather than their physical appearance; this has implications for improving the efficiency of the visual arts, so that talent and creativity can be brought to the forefront. The animation data is then transferred to a 3D model, which executes the same movements as the actor. This method can be compared with the traditional rotoscoping approach.
The proposed IMU sensor-based motion capture solution was built for human arm motion capture using inverse kinematics. The orientation of the upper and lower arm is tracked by the IMU sensors. Sensor calibration was required in this application to maximize the tracking performance and accuracy of the motion capture; a basic calibration procedure is another contribution of this paper. The prototype was divided into two sub-projects: one for the IoT-based tracking of the six degrees of freedom, transmitting the data to the cloud so that it can be accessed from anywhere on the planet, and the other for the motion capture system, which can capture the movement of any object and translate it into the action of a computer-generated 3D character on screen.
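The calibration procedure itself is not reproduced in this section. The sketch below illustrates the common offset-bias approach for a stationary MPU-6050: a batch of raw readings is averaged and the average is subtracted from subsequent samples. The `read_raw_sample()` helper is hypothetical and stands in for whatever I2C read the firmware actually performs; the 200-sample window and the gravity correction on the Z accelerometer axis are assumptions, not the prototype's exact settings.

```python
import numpy as np

def read_raw_sample():
    """Hypothetical stand-in for an I2C read of the MPU-6050:
    returns (ax, ay, az, gx, gy, gz) in raw sensor units."""
    return np.random.normal([0, 0, 16384, 0, 0, 0], 80)  # 16384 LSB/g at +/-2 g

def estimate_offset_bias(n_samples=200):
    """Average n_samples readings taken while the sensor is stationary.
    Gravity (1 g on the Z accelerometer axis) is kept out of the bias."""
    samples = np.array([read_raw_sample() for _ in range(n_samples)])
    bias = samples.mean(axis=0)
    bias[2] -= 16384.0  # assume Z is aligned with gravity during calibration
    return bias

def corrected_sample(bias):
    """Subtract the stored offset bias from a fresh reading."""
    return read_raw_sample() - bias

bias = estimate_offset_bias()
print("offset bias:", np.round(bias, 1))
print("corrected sample:", np.round(corrected_sample(bias), 1))
```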
The MoCap system introduced in this study has several advantages over other motion capture techniques, which are as follows:
Performance can be achieved with low latency, close to real time. This can lower the cost of keyframe-based animation for entertainment purposes.
Unlike traditional techniques, the amount of effort does not change much with the length or complexity of the performance. This allows multiple takes to be recorded with varied techniques or deliveries, producing a range of characterizations bounded only by the talent of the actor.
Compared with traditional animation approaches, a much larger amount of animation data can be created in a given period. This helps to meet production deadlines and improves cost-effectiveness.
Costs can be reduced further by using open-source software and third-party services.
This report describes the development of a real-time IMU-based mocap system. To enhance flexibility and usability, a serial-chain network is proposed to record orientation data from multiple IMU sensors. Based on the results of the experiments, the designed prototype, which comprises two IMU sensors, is capable of gathering orientation data from an object.
Tables II to X show the accelerometer and gyroscope values at different positions, with and without offset bias, for a better understanding of the noise. Analyzing these tables also indicates how the prototype can be improved. After examining all the readings, it can be concluded that the filter performed well on the accelerometer values, bringing them close to the actual values. For the accelerometer, the noise along the Z-axis is higher than along the X-axis and Y-axis. The gyroscope values are also filtered effectively; the recorded values show that the noise along the Y-axis is lower than along the X-axis and Z-axis [51]. The noise filtration for the whole system (both gyroscope and accelerometer) is about 17.237 percent.
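For reference, the complementary filter named in the index terms and in [38] can be summarized as blending the integrated gyroscope rate with the accelerometer-derived tilt angle. The sketch below is a minimal single-axis version; the blend factor of 0.98, the 10 ms sample period, and the sample stream are assumptions for illustration, not the values used in the prototype.

```python
import math

def accel_angle_deg(ay, az):
    """Tilt angle about the X-axis estimated from accelerometer components."""
    return math.degrees(math.atan2(ay, az))

def complementary_filter(angle_deg, gyro_rate_dps, ay, az, dt=0.01, alpha=0.98):
    """Blend the gyro-integrated angle (fast, but drifting) with the
    accelerometer angle (noisy, but drift-free)."""
    gyro_angle = angle_deg + gyro_rate_dps * dt
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle_deg(ay, az)

# Hypothetical stream of (gyro X rate in deg/s, accel Y in g, accel Z in g)
stream = [(2.0, 0.02, 0.99), (1.5, 0.03, 0.98), (-0.5, 0.01, 1.00)]
angle = 0.0
for gx, ay, az in stream:
    angle = complementary_filter(angle, gx, ay, az)
print("filtered roll angle (deg):", round(angle, 3))
```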
Based on mocap market analysis, an annual growth rate of 9.8 percent was expected between 2018 and 2023; this figure has since been revised to 12.12 percent for the period 2019-2024. Motion capture is anticipated to be a $252.8 million industry within five years. Because the industry is growing at such a rapid pace, it is driving new innovations and advancements such as marker-less capture, which means that reflective markers (the "ping-pong balls") will no longer be required. A higher number of DoFs has a positive impact on mocap performance. In addition, mocap manufacturers are attempting to make the equipment more portable and affordable.
[1] W.-H. Tan, W.-J. Li, Y.-Z. Zheng, and X.-C. Zhou, “ePet: A physical game based on wireless sensor networks,” Int. J. Distrib. Sens. Netw., vol. 5, no. 1, p. 68, Jan. 2009, doi: http://dx.doi.org/10.1080/15501320802555262
[2] L. BO, “The development and design of motion capture system based on sensors,” M.S. thesis, Beijing Insti. Technol., Beijing, China, 2011.
[3] Z. Wang, C. Zhao, and S. Qiu, “A system of human vital signs monitoring and activity recognition based on body sensor network,” Sens. Rev., vol. 34, no. 1, pp. 42–50, Jan. 2014, doi: https://doi.org/10.1108/SR-12-2012-735
[4] K. D. Nguyen, I-M. Chen, Z. Luo, S.H. Yeo, and H. B. L. Duh, “A wearable sensing system for tracking and monitoring of functional arm movement,” IEEE/ASME Trans. Mechatron., vol. 16, pp. 213–220, Apr. 2011, doi: http://dx.doi.org/10.1109/TMECH.2009.2039222
[5] P.-Z. Chen, J. Li, M. Luo, and N.-h. Zhu, "Real-Time human motion capture driven by a wireless sensor network,” Int. J. Comput. Games Technol., vol. 2015, Art. no. 695874, Feb. 2015, doi: https://doi.org/10.1155/2015/695874
[6] Z. Wang, S. Qiu, Z. Cao, and M. Jiang, “Quantitative assessment of dual gait analysis based on inertial sensors with body sensor network,” Sensor Rev., vol. 33, no. 1, pp. 48–56, Jan. 2013, doi: http://dx.doi.org/10.1108/02602281311294342
[7] B. Xu, “The design and implementation of a motion capture system based on MEMS sensors and Zigbee network,” M. S. thesis, Univ. Elec. Sci. Technol., 2013.
[8] K. Motta-Valencia, “Dance-Related injury,” Phys. Med. Rehabil. Clin. N. Am., vol. 17, no. 3, pp. 697–723, Aug. 2006, doi: https://doi.org/10.1016/j.pmr.2006.06.001
[9] C. E. Hiller, K. M. Refshauge, and D. Beard, “Sensory motor control is impaired in dancers with functional ankle instability,” Am. J. Sports Med., vol. 32, no. 1, pp. 216–223, Jan. 2004, https://doi.org/10.1177/0363546503258887
[10] F. Thomas and O. Johnston, The illusion of life: Disney animation. Disney, 1981.
[11] H. Beckerman, Animation the whole story. New York, Allworth Press, 2012.
[12] J. D. B. Stillman and E. Muybridge, The horse in motion: As shown by instantaneous photography with a study on animal mechanics. Boston, Osgood and Co. 1882.
[13] E.-J. Marey, Animal mechanism: A treatise on terrestrial and aerial locomotion. New York, Appleton and Co. 2016.
[14] C. Webster, Animation: The mechanics of motion. Oxford, Burlington, Ma. Elsevier Focal Press, 2005.
[15] M. Fleischer, Method of producing moving picture cartoons. United States of America Patent Appl., Oct. 1917. https://patentimages.storage.googleapis.com/38/02/89/60f7fdee74fa55/US1242674.pdf
[16] A. Menache, Understanding motion capture for computer animation and video games. San Diego, Ca. Morgan Kaufmann, 2000.
[17] J. K. Waters, Blobitecture: Waveform architecture and digital design. Gloucester, Mass. Rockport Publishers, 2003.
[18] J. F. Orr and J. C. Shelton, Optical measurement methods in biomechanics. New York, Chapman & Hall, 1997.
[19] S. Corazza, L. Mündermann, A. M. Chaudhari, T. Demattio, C. Cobelli, and T. P. Andriacchi, “A markerless motion capture system to study musculoskeletal biomechanics: Visual hull and simulated annealing approach,” Ann. Biomed. Eng., vol. 34, pp. 1019–1029, May 2006, doi: https://doi.org/10.1007/s10439-006-9122-8
[20] Y. Ohgi, “MEMS sensor application for the motion analysis in sports science,” ABCM Symp. Series Mechat., vol. 2, pp. 501–508, 2006.
[21] R. Okada, B. Stenger, T. Ike, and N. Kondoh, “Virtual fashion show using realtime markerless motion capture,” Asian Conf. Comput. Vis., Berlin, Heidelberg: Springer-Verlag, Jan. 13–16, 2006, pp. 801–810, doi: https://doi.org/10.1007/11612704_80
[22] N. Kiryati, T. R. Raviv, Y. Ivanchenko, and S. Rochel, “Real-time abnormal motion detection in surveillance video,” 19th Int. Conf. Pattern Recog, 2008, pp. 1–4. https://doi.org/10.1109/ICPR.2008.476113
[23] R. Ma, D. Chablat, F. Bennis, and L. Ma, “A framework of motion capture system based human behaviours simulation for ergonomic analysis,” Proc. Int. Conf. HCI Int., Orlando, FL, USA, July 9–14, 2011, pp. 360–364, doi: https://doi.org/10.1007/978-3-642-22095-1_73
[24] B. Rosenhahn, R. Klette, and D. Metaxas, Human motion: Understanding, modelling, capture and animation. Dordrecht: Springer, 2010.
[25] M. O'rourke, Principles of three-dimensional computer animation: modelling, rendering, and animating with 3D computer graphics. New York, Norton and Co, 2003.
[26] R. Parent, Computer animation: Algorithms & techniques. Waltham, MA: Morgan Kaufmann, 2008.
[27] W. Takano, Y. Murakami, and Y. Nakamura, “Representation and classification of whole-body motion integrated with finger motion,” Rob Auton Syst., vol. 124, Art. no. 103378, Feb. 2020, doi: https://doi.org/10.1016/j.robot.2019.103378
[28] W. S. Meador, T. J. Rogers, K. O’Neal, E. Kurt, and C. Cunningham, “Mixing dance realities: Collaborative development of live-motion capture in a performing arts environment,” Comput. Entertain., 2004, vol. 2, no. 2, pp. 12–12, doi: https://doi.org/10.1145/1008213.1008233
[29] J. Haag, “Inertial motion capture and live performance (with a Focus on Dance),” Dance Dialog: Conversations across cultures, artforms and practices, Queensland University of Technology, Brisbane, Queensland, July 13–18, 2009.
[30] A. Andreadis, A. Hemery, A. Antonakakis, G. Gourdoglou, P. Mauridis, D. Christopoulos, and J. N. Karigiannis, “Real-time motion capture technology on a live theatrical performance with computer-generated scenery,” Proc. PCI, Tripoli, Greece, 2010, pp. 148–152.
[31] J. V. D. Linden, E. Schoonderwaldt, J. Bird, and R. Johnson, “MusicJacket - Combining motion capture and vibrotactile feedback to teach violin bowing,” IEEE Trans. Instrument. Measur., vol. 60, no. 1, pp. 104 –113, Sep. 2011, doi: https://doi.org/10.1109/TIM.2010.2065770
[32] A. L. Aguinaldo and H. Chamber, “Correlation of throwing mechanics with elbow valgus load in adult baseball pitchers,” Am. J. Spor. Med., vol. 37, no. 10, pp. 2043–2048, July 2009, doi: https://doi.org/10.1177/0363546509336721
[33] G. D. Myer, K. R. Ford, S. G. Mclean, and T. E. Hewett, “The effects of plyometric versus dynamic stabilization and balance training on lower extremity biomechanics”, Am. J. Sport. Med., vol. 34, no. 3, pp. 445–455, Mar. 2006, doi: https://doi.org/10.1177/0363546505281241
[34] S. G. Mclean, A. Su, and A. L. V. D. Bogert, “Development and validation of a 3-D model to predict knee joint loading during dynamic movement,” J. Biomech. Eng., vol. 125, pp. 864–874, 2003, doi: https://doi.org/10.1115/1.1634282
[35] G. Welch and E. Foxlin, “Motion tracking: No silver bullet, but a respectable arsenal,” IEEE Comput. Graph. Applic., vol. 22, no. 6, pp. 24–38, 2002, doi: https://doi.org/10.1109/MCG.2002.1046626
[36] N. Abbate, A. Basile, C. Brigante, and A. Faulisi, “Development of a MEMS based wearable motion capture system,” 2nd Conf. Human System Interactions, Catania, Italy, May 21–23, 2009, pp. 255–259, doi: https://doi.org/10.1109/HSI.2009.5090988
[37] InvenSense. “MPU-6000 and MPU-6050 Product Specification Revision 3.4.” https://invensense.tdk.com/wp-content/uploads/2015/02/MPU-6000-Datasheet1.pdf (Aug. 19, 2013).
[38] P. Narkhede, S. Poddar, R. Walambe, G. Ghinea, and K. Kotecha, “Cascaded complementary filter architecture for sensor fusion in attitude estimation,” Sensors, vol. 21, no. 6, Art. no. 1937, Mar. 2021, doi: https://doi.org/10.3390/s21061937
[39] D. Ibrahim, Microcontroller based applied digital control. Hoboken, Nj: John Wiley, 2006.
[40] V. N. Lage, A. K. R. Segundo, and T. V. B. e Pinto, “Mathematical modelling of a two degree of freedom platform using accelerometers and gyro sensors,” J. Mech. Eng. Autom., vol. 6, no. 8, pp. 427–433, 2016, doi: https://doi.org/10.17265/2159-5275/2016.08.006
[41] K. Y. Tong, A. F. T. Mak, and W. Y. Ip, “Command control for functional electrical stimulation hand grasp systems using miniature accelerometers and gyroscopes,” Med. Biol. Eng. Comput., vol. 41, pp. 710–717, Nov. 2003, doi: https://doi.org/10.1007/BF02349979
[42] J. Fei, W. Dai, M. Hua, and Y. Xue, “System dynamics and adaptive control of mems gyroscope sensor,” IFAC Proc. 2011, Vol. 44, pp. 3551–3556, doi: https://doi.org/10.3390/s17112663
[43] C. Coopmans, A. M. Jensen, and Y. Q. Chen, “Fractional-Order complementary filters for small unmanned aerial system navigation,” J. Intell. Robot Syst., vol. 73, pp. 429–453, Oct. 2013, doi: https://doi.org/10.1007/s10846-013-9915-6
[44] K. Yamane and J. Hodgins, “Simultaneous tracking and balancing of humanoid robots for imitating human motion capture data,” IEEE/RSJ Int. Conf. Intell. Robot. Syst., St. Louis, MO, USA, Oct. 10–15, 2009, doi: https://doi.org/10.1109/IROS.2009.5354750
[45] D. Noonan, P. Mountney, D. Elson, A. Darzi, and G.-Z. Yang, “A stereoscopic fibroscope for camera motion and 3-d depth recovery during minimally invasive surgery,” IEEE Int. Conf. Robo. Automat., Kobe, Japan, May 12–17, 2009, pp. 4463–4468, doi: https://doi.org/10.1109/ROBOT.2009.5152698
[46] I. Rapp. “Motion capture actors: Body Movement Tells the story.” NY Castings – Direct Submit. https://www.nycastings.com/motion-capture-actors-body-movement-tells-the-story/ (accessed Jan. 19, 2023).
[47] A. H. Salomon. “Growth in performance capture helping gaming actors weather slump.” Backstage.com. https://www.backstage.com/magazine/article/growth-performance-capture-helping-gaming-actors-weather-slump-47881/ (accessed Jan. 19, 2023).
[48] B. Child. “Andy serkis: why won’t Oscars go ape over motion-capture acting?” Theguardian.com. https://www.theguardian.com/film/2011/aug/12/andy-serkis-motion-capture-acting (accessed Jan 19, 2023).
[49] H. Hart. “Wired magazine, When will a motion capture actor win an Oscar?” Wired.com. https://www.wired.com/2012/01/andy-serkis-oscars/ (accessed Jan 19, 2023).
[50] G. K. M. Cheung, T. Kanade, J.-Y. Bouguet, and M. Holler, “A real time system for robust 3D voxel reconstruction of human motions,” Proc. IEEE Conf. Comput. Vision Pattern Recog., CVPR 2000.
[51] Ameri Research. “3D motion capture market size, analysis.” https://www.ameriresearch.com/product-tag/3d-motion-capture-market/ (accessed Jan. 19, 2023).
[52] Optitrack. “Motion capture suits.” https://optitrack.com/accessories/wear/. (accessed Jan 19, 2023).
[53] Xsens. “Refreshed pricing model for Xsens’ motion capture solution.” Available: https://www.xsens.com/news/xsens-motion-capture-pricing (accessed Jan 19, 2023).
[54] Rokoko. “Quality motion capture in one simple suit.” https://www.rokoko.com/products/smartsuit-pro (accessed Jan 19, 2023).
[55] S. Hayden. “Noitom releases Perception Neuron 2.0 motion capture system.” Road to VR. https://www.roadtovr.com/noitom-releases-perception-neuron-2-0-motion-capture-system/ (accessed Jan. 19, 2023).