<?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE article PUBLIC "-//NLM/DTD JATS (Z39.96) Journal Publishing DTD v1.2 20120330//EN" "http://jats.nlm.nih.gov/publishing/1.2/JATS-journalpublishing1.dtd">
    <!--<?xml-stylesheet type="text/xsl" href="article.xsl">-->
<article xmlns:ns0="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="1.2" xml:lang="en">
	<front>
		<journal-meta>
			<journal-id journal-id-type="eissn">3034-1558</journal-id>
			<journal-title-group>
				<journal-title>Cifra. Information technology and telecommunications</journal-title>
			</journal-title-group>
			<publisher>
				<publisher-name>Cifra LLC</publisher-name>
			</publisher>
		</journal-meta>
		<article-meta>
			<article-id pub-id-type="doi">10.60797/itech.2025.7.7</article-id>
			<article-categories>
				<subj-group>
					<subject>Brief communication</subject>
				</subj-group>
			</article-categories>
			<title-group>
				<article-title>MECANUM-WHEELED ROBOT WITH AI-DRIVEN MANIPULATION</article-title>
			</title-group>
			<contrib-group>
				<contrib contrib-type="author" corresp="yes">
					<contrib-id contrib-id-type="orcid">https://orcid.org/0000-0003-1906-9932</contrib-id>
					<name>
						<surname>Mohapatra</surname>
						<given-names>Badri Narayan</given-names>
					</name>
					<email>badri1.mohapatra@gmail.com</email>
					<xref ref-type="aff" rid="aff-3">3</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Vacche</surname>
						<given-names>Sakshat</given-names>
					</name>
					<email>sakshatvacche8@gmail.com</email>
					<xref ref-type="aff" rid="aff-1">1</xref>
				</contrib>
				<contrib contrib-type="author">
					<name>
						<surname>Nalge</surname>
						<given-names>Snehal</given-names>
					</name>
					<email>snehalnalge246@gmail.com</email>
					<xref ref-type="aff" rid="aff-2">2</xref>
				</contrib>
			</contrib-group>
			<aff id="aff-1">
				<label>1</label>
				<institution>All India Shri Shivaji Memorial Society's Institute of Information Technology</institution>
			</aff>
			<aff id="aff-2">
				<label>2</label>
				<institution>All India Shri Shivaji Memorial Society's Institute of Information Technology</institution>
			</aff>
			<aff id="aff-3">
				<label>3</label>
				<institution>All India Shri Shivaji Memorial Society's Institute of Information Technology</institution>
			</aff>
			<pub-date publication-format="electronic" date-type="pub" iso-8601-date="2025-07-14">
				<day>14</day>
				<month>07</month>
				<year>2025</year>
			</pub-date>
			<pub-date pub-type="collection">
				<year>2025</year>
			</pub-date>
			<volume>8</volume>
			<issue>7</issue>
			<fpage>1</fpage>
			<lpage>8</lpage>
			<history>
				<date date-type="received" iso-8601-date="2025-03-21">
					<day>21</day>
					<month>03</month>
					<year>2025</year>
				</date>
				<date date-type="accepted" iso-8601-date="2025-04-23">
					<day>23</day>
					<month>04</month>
					<year>2025</year>
				</date>
			</history>
			<permissions>
				<copyright-statement>Copyright: &#x00A9; 2025 The Author(s)</copyright-statement>
				<copyright-year>2025</copyright-year>
				<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
					<license-p>
						This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC-BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. See 
						<uri xlink:href="http://creativecommons.org/licenses/by/4.0/">http://creativecommons.org/licenses/by/4.0/</uri>
					</license-p>
				</license>
			</permissions>
			<self-uri xlink:href="https://itech.cifra.science/archive/3-7-2025-july/10.60797/itech.2025.7.7"/>
			<abstract>
				<p>This study introduces an innovative robotic system that integrates AI vision, a robotic arm, and Mecanum wheels to address challenges in dynamic environments requiring precise navigation and object manipulation. The primary objective is to develop a unified architecture that combines hardware and software for real-time object detection, tracking, and synchronized control of the robotic arm and mobile platform. The system features a custom-built mobile platform equipped with Mecanum wheels powered by 12V DC motors, enabling omni-directional mobility for navigating complex terrains. An Arduino UNO serves as the central controller and executes the control algorithms. The system’s performance was evaluated in a controlled 5m × 5m environment, demonstrating high accuracy in calibration, obstacle avoidance, and dynamic tracking. The results indicate that the Mecanum wheels ensured efficient navigation in cluttered spaces. This paper highlights the potential of AI-driven adaptability in robotics, offering applications in manufacturing, healthcare, logistics, and beyond. Future work could explore enhanced AI models for more complex object recognition tasks, increased payload capacity, and improved energy efficiency, further extending the system's applicability in real-world scenarios.</p>
			</abstract>
			<kwd-group>
				<kwd>Mecanum wheel</kwd>
				<kwd>Robotic system</kwd>
				<kwd>MIT App Inventor</kwd>
				<kwd>Arduino UNO</kwd>
			</kwd-group>
		</article-meta>
	</front>
	<body>
		<sec>
			<p>1. Introduction</p>
			<p>The increasing need for automation in industries such as manufacturing, logistics, and healthcare has driven tremendous growth in robotics. Yet the efficient use of robots in dynamic and uncertain environments remains a major challenge: such environments tend to require accurate navigation, real-time object perception, and dexterous manipulation capabilities [1], [8], [4], [2], [7], [3], [5], [6].</p>
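			<p>Since the platform's omni-directional mobility comes from its Mecanum wheels, the wheel-speed mixing behind such navigation can be illustrated with the standard Mecanum inverse-kinematics relations. The sketch below is a minimal illustration under assumed conventions; the function name and the chassis half-dimensions are illustrative, not values taken from the paper's hardware.</p>

```python
def mecanum_wheel_speeds(vx, vy, wz, lx=0.15, ly=0.15):
    """Return (front_left, front_right, rear_left, rear_right) wheel
    speeds for body velocities vx (forward), vy (strafe left) and yaw
    rate wz, for a chassis with half-length lx and half-width ly (m).

    Standard Mecanum mixing: each wheel speed is a signed combination
    of the three body velocities, with the sign pattern set by the
    45-degree roller orientation on each wheel.
    """
    k = lx + ly
    fl = vx - vy - k * wz   # front-left
    fr = vx + vy + k * wz   # front-right
    rl = vx + vy - k * wz   # rear-left
    rr = vx - vy + k * wz   # rear-right
    return fl, fr, rl, rr
```

			<p>For example, a pure strafe (vx = 0, vy &gt; 0) yields opposite signs on diagonally opposite wheels, which is what produces sideways motion without rotation.</p>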
			<p>2. Research methods and principles</p>
			<p>Voice-operated robotic systems based on Arduino microcontrollers have attracted considerable interest because of their possible uses in environments that are inaccessible or dangerous to humans. The systems utilize voice commands to operate robotic movements, providing a hands-free and user-friendly interface for users.</p>
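			<p>The command path described above, a spoken or app command mapped to a motor action, can be sketched as a simple lookup with a safe fallback. The command names and PWM magnitudes below are illustrative assumptions, not the paper's actual command set.</p>

```python
# Assumed command set: left/right PWM pairs for a differential drive.
# Unknown or misrecognized commands fall back to a safe stop.
COMMANDS = {
    "forward":  (200, 200),
    "backward": (-200, -200),
    "left":     (-150, 150),
    "right":    (150, -150),
    "stop":     (0, 0),
}

def parse_command(text):
    """Map a recognized voice/app command to (left_pwm, right_pwm)."""
    return COMMANDS.get(text.strip().lower(), (0, 0))
```

			<p>Defaulting to a stop on unrecognized input is a deliberate safety choice for voice interfaces, where recognition errors are common in noisy settings.</p>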
			<fig id="F1">
				<label>Figure 1</label>
				<caption>
					<p>Block diagram of a Line Follower Robot</p>
				</caption>
				<alt-text>Block diagram of a Line Follower Robot</alt-text>
				<graphic ns0:href="/media/images/2025-03-27/cbf78c71-da6f-4c66-b30d-18e75066ff90.jpg"/>
			</fig>
			<fig id="F2">
				<label>Figure 2</label>
				<caption>
					<p>Line Follower Robot controlled by a smartphone using MIT App Inventor</p>
				</caption>
				<alt-text>Line Follower Robot controlled by a smartphone using MIT App Inventor</alt-text>
				<graphic ns0:href="/media/images/2025-03-21/9ef7cdb8-6516-4763-94dc-2002acba1155.jpg"/>
			</fig>
			<fig id="F3">
				<label>Figure 3</label>
				<caption>
					<p>MIT App Inventor interface</p>
				</caption>
				<alt-text>MIT App Inventor interface</alt-text>
				<graphic ns0:href="/media/images/2025-07-18/7b5cdb4c-72f3-4cbd-b816-6f681b0be3fb.jpg"/>
			</fig>
			<fig id="F4">
				<label>Figure 4</label>
				<caption>
					<p>Development environment of MIT App Inventor</p>
				</caption>
				<alt-text>Development environment of MIT App Inventor</alt-text>
				<graphic ns0:href="/media/images/2025-07-18/181e22e5-229a-4733-a6b0-079bce604915.jpg"/>
			</fig>
			<fig id="F5">
				<label>Figure 5</label>
				<caption>
					<p>Bluetooth connection process</p>
				</caption>
				<alt-text>Bluetooth connection process</alt-text>
				<graphic ns0:href="/media/images/2025-07-18/2e1351be-9ce8-48cc-956b-bd2036aafb99.png"/>
			</fig>
			<fig id="F6">
				<label>Figure 6</label>
				<caption>
					<p>Control logic for the proposed system</p>
				</caption>
				<alt-text>Control logic for the proposed system</alt-text>
				<graphic ns0:href="/media/images/2025-07-18/10eefa28-df0c-4f1a-bba3-5330f5973525.png"/>
			</fig>
			<p>Figure 5 shows a flowchart of the Bluetooth connection process in the application. The process begins with a scan for devices. If a device is detected, the user is prompted to connect; once connected, the application enters a data transmit/receive state. If the connection is lost, the application attempts to reconnect, and if reconnection fails, it stops scanning for devices and notifies the user.</p>
			<p>Figure 6 depicts the control logic for a system that communicates with a microcontroller over Bluetooth. The process starts with an initial value being sent to the microcontroller. User input is accepted via direction buttons and a stop button. When a direction button is pressed and a Bluetooth connection is active, the system determines the appropriate Pulse Width Modulation (PWM) and direction values, sends them to the microcontroller, and changes the button's background color to give feedback. When the stop button is pressed while a Bluetooth connection exists, the system resets the PWM and direction values to zero, signalling the microcontroller to begin braking. At the same time, the system displays relevant information, such as speed, on the screen. If the Bluetooth connection is lost at any stage, the system suspends and waits for the connection to be re-established before accepting further commands. The process terminates when the stop button is pressed and the braking data has been successfully sent to the microcontroller. This control logic is responsible for safe, user-driven operation over a continuous communication channel with the microcontroller.</p>
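			<p>The button-to-microcontroller logic in the flowchart can be sketched as a small state machine: direction buttons set PWM and direction values, the stop button zeroes them, and all commands are suspended while the Bluetooth link is down. The class and field names below are illustrative assumptions, not the application's actual code.</p>

```python
class BluetoothController:
    """Minimal sketch of the flowchart's control logic (assumed names)."""

    def __init__(self):
        self.connected = False
        self.pwm = 0
        self.direction = 0      # 0 = stopped; nonzero = a direction code
        self.sent = []          # frames "sent" to the microcontroller

    def press_direction(self, direction, pwm):
        """Direction button: only acts while the Bluetooth link is up."""
        if not self.connected:          # suspend until reconnection
            return False
        self.pwm, self.direction = pwm, direction
        self.sent.append((self.pwm, self.direction))
        return True

    def press_stop(self):
        """Stop button: zero PWM/direction to signal braking."""
        if not self.connected:
            return False
        self.pwm = self.direction = 0
        self.sent.append((0, 0))
        return True
```

			<p>The connection check at the top of each handler mirrors the flowchart's rule that commands are held back until the link is re-established.</p>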
			<p>3. Main results</p>
			<p>Voice-controlled systems are especially helpful in controlling electronic devices, where users can turn on or off devices with voice commands via a Bluetooth connection on a mobile phone. This method not only makes it easy to control electronic devices but also assists in saving electricity by turning off devices when they are not in use. Further, voice-controlled robots with object identification and picking skills are also in the making to further augment automation in sectors like warehousing and healthcare, illustrating the adaptability and potential of these systems.</p>
			<fig id="F7">
				<label>Figure 7</label>
				<caption>
					<p>Line follower robot prototype model</p>
				</caption>
				<alt-text>Line follower robot prototype model</alt-text>
				<graphic ns0:href="/media/images/2025-03-21/d59ebe91-a1a4-4391-9515-977117a34d5d.jpg"/>
			</fig>
			<p>The essential parts are an Arduino Uno microcontroller, a motor driver, DC motors, batteries, and an ultrasonic sensor. The Arduino Uno acts as the brain, taking input from the ultrasonic sensor and commanding the motor driver to move the robot. The motor driver, supplied by the batteries, controls the speed and direction of the DC motors. The ultrasonic sensor provides distance readings, enabling the robot to navigate around obstacles. The chassis, likely acrylic, gives a clear view of the inner workings. Although the wiring is mostly tidy, some areas could be reorganized. Overall, the robot demonstrates seamless integration, with room for future development and customization thanks to the highly adaptable Arduino Uno platform.</p>
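			<p>The distance-based avoidance behaviour described above can be sketched in two steps: converting the ultrasonic echo pulse to a distance, then gating the drive action on a safety threshold. The 20 cm threshold and the function names are assumed values for illustration, not parameters reported in the paper.</p>

```python
SAFE_DISTANCE_CM = 20   # assumed threshold, not from the paper

def echo_to_cm(pulse_us):
    """Convert an HC-SR04-style echo pulse width (microseconds) to cm.

    Sound travels roughly 0.0343 cm/us, and the echo covers the
    out-and-back path, so the result is halved.
    """
    return pulse_us * 0.0343 / 2

def avoidance_action(distance_cm):
    """Choose a drive action from an ultrasonic distance reading."""
    if distance_cm < SAFE_DISTANCE_CM:
        return "turn"       # obstacle too close: steer away
    return "forward"        # path clear: keep moving
```

			<p>On the actual hardware the Arduino would sample the pulse width, apply this conversion, and drive the motor driver accordingly on each loop iteration.</p>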
			<p>4. Discussion</p>
			<p>The Bluetooth-controlled system operated successfully according to the designed flowchart. Direction-button and stop-button input from the user was correctly converted into control signals, and a stable Bluetooth connection provided consistent communication with the microcontroller. The system demonstrated precise motor control, responding promptly to direction changes and executing the stop command reliably. Visual feedback through color-changing buttons and on-screen data improved the user experience and situational awareness. Simple error-handling mechanisms were implemented to deal with Bluetooth disconnections, ensuring safe operation by suspending commands until the connection was re-established. Overall, the system offered an intuitive and robust interface for controlling a microcontroller-based system over Bluetooth.</p>
			<p>5. Conclusion</p>
			<p>Voice-controlled robots have numerous applications, including military operations, home security, rescue missions, and medical assistance. They are particularly useful in situations where human presence is risky or impractical. They are also useful in helping disabled people, allowing them a method of controlling devices without physical interaction. Although existing systems exhibit good voice control, there are still issues in enhancing voice recognition accuracy, particularly in noisy settings. Future studies may address how to make voice recognition more robust and increase the capability of such robots to accomplish more sophisticated tasks. Moreover, incorporating more advanced technologies like natural language processing could further enhance human-robot interaction.</p>
		</sec>
	</body>
	<back>
		<ack>
			<title>Acknowledgements</title>
			<p>The authors would like to thank AISSMS Institute of Information Technology for providing the resources and facilities necessary to complete this research.</p>
		</ack>
		<sec>
			<title>Competing Interests</title>
			<p/>
		</sec>
		<ref-list>
			<ref id="B1">
				<label>1</label>
				<mixed-citation publication-type="confproc">Alzaydi A. Human-Robot Interaction in Saudi Arabia's E-Mobility Transition — A Literature Review / A. Alzaydi, K. Abedalrhman, M. Ismail [et al.] // Social Science Journal for Advanced Research. — 2024. — № 4. — P. 74–96.</mixed-citation>
			</ref>
			<ref id="B2">
				<label>2</label>
				<mixed-citation publication-type="confproc">Cheng P. A deep learning-enhanced multi-modal sensing platform for robust human object detection and tracking in challenging environments / P. Cheng, Z. Xiong, Y. Bao [et al.] // Electronics. — 2023. — № 12 (16). — P. 3423. — DOI: 10.3390/electronics12163423.</mixed-citation>
			</ref>
			<ref id="B3">
				<label>3</label>
				<mixed-citation publication-type="confproc">Chikhale M.V. Voice controlled robotic system using Arduino microcontroller / M.V. Chikhale, M.R. Gharat, M.S. Gogate [et al.] // International Journal of New Technology and Research. — 2017. — № 3 (4). — P. 263302.</mixed-citation>
			</ref>
			<ref id="B4">
				<label>4</label>
				<mixed-citation publication-type="confproc">Liu C. A multitasking-oriented robot arm motion planning scheme based on deep reinforcement learning and twin synchro-control / C. Liu, J. Gao, Y. Bi [et al.] // Sensors. — 2020. — № 20 (12). — P. 3515. — DOI: 10.3390/s20123515.</mixed-citation>
			</ref>
			<ref id="B5">
				<label>5</label>
				<mixed-citation publication-type="confproc">Mohapatra B.N. Implementation of a line follower robot using microcontroller / B.N. Mohapatra, K.U.J. Husain, R.K. Mohapatra // International Journal of Innovative Technology and Exploring Engineering. — 2019. — № 9 (2). — P. 2155–2158.</mixed-citation>
			</ref>
			<ref id="B6">
				<label>6</label>
				<mixed-citation publication-type="confproc">Mohapatra B.N. Design of an automated agricultural robot and its prime issues / B.N. Mohapatra, R.K. Mohapatra. — 2020.</mixed-citation>
			</ref>
			<ref id="B7">
				<label>7</label>
				<mixed-citation publication-type="confproc">Taheri H. Omnidirectional mobile robots, mechanisms and navigation approaches / H. Taheri, C.X. Zhao // Mechanism and Machine Theory. — 2020. — № 153. — P. 103958. — DOI: 10.1016/j.mechmachtheory.2020.103958.</mixed-citation>
			</ref>
			<ref id="B8">
				<label>8</label>
				<mixed-citation publication-type="confproc">Zhang L. RGB-D Camera-Based Depth Measurement of Castings in Dynamic Environments / L. Zhang, Z. Chen, J. Miao [et al.] // International Journal of Metalcasting. — 2024. — P. 1–14. — DOI: 10.1007/s40962-024-00758-8.</mixed-citation>
			</ref>
			<ref id="B9">
				<label>9</label>
				<mixed-citation publication-type="confproc">Kim J. AI-driven robotic perception for smart automation / J. Kim, H. Park, S. Lee [et al.] // IEEE Transactions on Industrial Electronics. — 2021. — № 68 (9). — P. 7654–7665. — DOI: 10.1109/TIE.2021.3064123.</mixed-citation>
			</ref>
			<ref id="B10">
				<label>10</label>
				<mixed-citation publication-type="confproc">Gupta R. Deep learning approaches for humanoid robotics / R. Gupta, A. Singh // International Journal of Artificial Intelligence and Robotics. — 2022. — № 5 (3). — P. 120–135. — DOI: 10.1016/j.ijair.2022.04.007.</mixed-citation>
			</ref>
			<ref id="B11">
				<label>11</label>
				<mixed-citation publication-type="confproc">Johnson M. Sensor fusion techniques in mobile robotics / M. Johnson, K. Patel // Robotics and Automation Letters. — 2020. — № 4 (2). — P. 1125–1130. — DOI: 10.1109/LRA.2020.2976598.</mixed-citation>
			</ref>
			<ref id="B12">
				<label>12</label>
				<mixed-citation publication-type="confproc">Yadav S. Adaptive control strategies for robotic manipulators / S. Yadav, P. Sharma, D. Mehta // Control Engineering Practice. — 2019. — № 86. — P. 104350. — DOI: 10.1016/j.conengprac.2019.104350.</mixed-citation>
			</ref>
			<ref id="B13">
				<label>13</label>
				<mixed-citation publication-type="confproc">Wang H. Neural network-based control of autonomous robots / H. Wang, Q. Zhou, X. Liu // IEEE Transactions on Cybernetics. — 2023. — № 53 (4). — P. 2098–2110. — DOI: 10.1109/TCYB.2023.3184075.</mixed-citation>
			</ref>
			<ref id="B14">
				<label>14</label>
				<mixed-citation publication-type="confproc">Dutta R. Real-time object recognition using CNN for robotic applications / R. Dutta, P. Banerjee // Journal of Robotics and Automation. — 2021. — № 7 (1). — P. 45–57. — DOI: 10.1109/JRA.2021.3135074.</mixed-citation>
			</ref>
			<ref id="B15">
				<label>15</label>
				<mixed-citation publication-type="confproc">Thomas A. Path planning algorithms for robotic navigation / A. Thomas, R. White // International Journal of Robotics Research. — 2018. — № 37 (5-6). — P. 594–612. — DOI: 10.1177/0278364918777749.</mixed-citation>
			</ref>
			<ref id="B16">
				<label>16</label>
				<mixed-citation publication-type="confproc">Singh P. Human-robot collaboration in industrial settings / P. Singh, K. Verma // Manufacturing Engineering Journal. — 2022. — № 10 (3). — P. 150–164. — DOI: 10.1016/j.mej.2022.03.004.</mixed-citation>
			</ref>
			<ref id="B17">
				<label>17</label>
				<mixed-citation publication-type="confproc">Lopez M. Deep reinforcement learning for robotic manipulation / M. Lopez, J. Fernandez // Advances in Robotics and AI. — 2020. — № 12 (7). — P. 228–245. — DOI: 10.1016/j.advrobot.2020.04.008.</mixed-citation>
			</ref>
			<ref id="B18">
				<label>18</label>
				<mixed-citation publication-type="confproc">Zhang Y. Multi-agent robotic coordination using AI / Y. Zhang, P. Li, T. Wu // Journal of Intelligent Systems. — 2019. — № 18 (2). — P. 67–85. — DOI: 10.1016/j.jisys.2019.02.001.</mixed-citation>
			</ref>
			<ref id="B19">
				<label>19</label>
				<mixed-citation publication-type="confproc">Kumar S. Cyber-physical systems and robotics: Emerging trends / S. Kumar, R. Agarwal // International Journal of Emerging Technologies in Robotics. — 2023. — № 15 (1). — P. 10–24. — DOI: 10.1016/j.ijetr.2023.01.002.</mixed-citation>
			</ref>
			<ref id="B20">
				<label>20</label>
				<mixed-citation publication-type="confproc">Xie B. Self-learning algorithms for mobile robots / B. Xie, J. Wang // Robotics and Autonomous Systems. — 2021. — № 138. — P. 103692. — DOI: 10.1016/j.robot.2021.103692.</mixed-citation>
			</ref>
		</ref-list>
	</back>
	<fundings/>
</article>