NUHS Embarks on Holomedicine Research in Singapore, Using Mixed Reality Technology to Enhance Diagnosis, Education and Patient Care
The National University Health System (NUHS) has embarked on a research and development programme within the academic healthcare cluster to explore the use of mixed reality (MR) technology in clinical care. The programme aims to support the development of next-generation clinical applications, improve patient safety, augment clinical processes, and enhance both undergraduate and postgraduate education. While the use of holographic technology in operating theatres is still nascent, NUHS hopes to apply it in multiple fields of surgery.
“Holographic technology may radically transform the way we practice medicine. Early adoption will place NUHS at the forefront of medical MR research and position us as a pioneer in the clinical use of this technology,” said Associate Professor Ngiam Kee Yuan, NUHS’s Group Chief Technology Officer, who is overseeing the research and development of the holomedicine programme in NUHS.
While holomedicine is new to the field of medicine, it has become increasingly prominent in the past year. “It leverages the concept of MR not only to augment our physical environment, but also to permit interaction with virtual objects superimposed onto the real world. The virtual objects can also be manipulated relative to the real world using natural hand gestures,” explained Dr Gao Yujia, Associate Consultant with the Division of Hepatobiliary & Pancreatic Surgery, National University Hospital (NUH), and the programme lead for holomedicine at NUHS.
Potential of holographic technology in brain surgery
A team of neurosurgeons at NUH has initiated a study to assess the feasibility of using holographic technology to spatially locate brain tumours when operating on patients.
Using holographic visors, a three-dimensional (3D) hologram of a patient’s brain scan is projected into space and superimposed onto the patient’s head during surgery. This hologram is generated from the patient’s own brain CT scan using 3D medical software created by apoQlar, called Virtual Surgery Intelligence (VSI). When the hologram is set in place on a patient’s head, surgeons are able to view the 3D holographic images of the brain from different angles.
They can also pull up information and alter the images in the visor through gesture and speech recognition, allowing them to interact with and control the holographic image that is superimposed onto the patient.
Clinical Associate Professor Yeo Tseng Tsai, Head and Senior Consultant with the Division of Neurosurgery, NUH said, “With this holographic technology, you are able to see inside the brain. You will be able to see the blood vessels and most importantly identify the tumour quickly and precisely, as well as to know which angle and the exact location to make the incision. For over thirty years now, we have been using a handheld navigation system to navigate and identify the location of the tumour. In comparison, this new mixed reality system is more intuitive as we can now see inside the patient’s head without the need to look up and refer to a computer screen while performing a procedure.”
With the potential of higher precision and quicker localisation, patient care and safety can be improved. Furthermore, with the MR headset weighing only around 500 grams, it may one day replace existing bulky operation theatre equipment and even reduce the exposure to radiation in procedures such as spine operations, where X-rays are currently used to guide the insertion of metallic implants.
NUHS has completed the first phase of the research programme, a proof-of-concept study in which various disciplines at NUH, such as neurosurgery, plastic surgery and ophthalmology, evaluated the MR devices and brainstormed future areas of development. The headsets were used for evaluation purposes to identify use cases, gauge clinicians’ acceptance levels, evaluate the system, and assess the sustainability of its long-term deployment.
NUHS has developed a holistic holomedicine roadmap, including short- and long-term research projects, integration with existing hospital systems, and procedures to onboard users and enhance the hospital’s infrastructure to support the system. Some potential use cases include streaming of live data from image acquisition machines, and the employment of artificial intelligence and machine learning for advanced image processing and predictive analytics. In addition, the HoloLens 2 device may also be used by patients to help them better understand the procedures they are to undergo. This is achieved by projecting a 3D rendering of a patient’s scan to better illustrate the steps of their procedure.
Clinical validation studies and trials will be necessary before this holomedicine solution can be adopted as a primary clinical method. This involves comparing MR technology against current gold standards of clinical practice, with outcomes measured including data on accuracy, stability, and the potential risks and limitations of the capability. Registration with the necessary governing bodies, including the Health Sciences Authority (HSA), will also be needed before the solution can be used in a direct patient intervention role.
Collaboration with industry
The NUHS team, in collaboration with the Engineering Design & Innovation Centre under the National University of Singapore’s (NUS) Faculty of Engineering, was also awarded $100,000 in March 2021 under the Engineering in Medicine Grant, jointly administered by NUHS and NUS. The grant will enable the team to embark on a project on real-time volumetric rendering and positioning of ultrasound scans.
Industry collaboration is an integral part of the research and development of holomedicine in NUHS. NUHS has been collaborating with Microsoft and apoQlar in its holomedicine efforts. Moving forward, NUHS will continue to build on its existing partnerships and explore new industry collaborations to push the boundaries of MR technology in the healthcare environment.
NUHS is also privileged to be a founding member of The Holomedicine Association, an international organisation comprising clinicians, scientists and industry partners who are engaged in active research and development of holomedicine. With 12 members on the founding committee including the executive board of the association, NUHS is well poised to participate in collaborations with other hospitals, research institutes, and industry partners around the world.
“We are not just an end-user of holomedicine; we hope to be an active developer and to validate its use in medicine. We will work with our partners to make holomedicine customisable, more user-friendly and clinically applicable,” said A/Prof Ngiam Kee Yuan.
“The experience of the past year shows us that technology can empower healthcare workers and assist them to help protect and save the lives of patients. NUHS’s holomedicine study bears further testament to how innovative use of technology, such as Mixed Reality solutions and Microsoft HoloLens 2, may have a truly transformational impact in healthcare. Technology-enabled neurosurgeons may be able to perform safer procedures, enable improved outcomes and ultimately provide better patient care. We are proud to work with NUHS and other clinicians in the region to explore the full potential of digital technology in holomedicine across multiple therapy areas in the years to come,” said Dr. Keren Priyadarshini, Regional Business Lead, Worldwide Health, Microsoft Asia.
“Holomedicine has the potential to sustainably revolutionise medical standards in most – if not all – medical fields. We are very excited to join forces with NUHS, a world-renowned academic health system, to achieve our mission of further improving patient outcomes and safety,” said Mr Sirko Pelzl, CEO, apoQlar.
The software by apoQlar obtained HSA’s Class A Licence and the Singapore Standard SS620 certification in April 2021. NUHS hopes to extend holographic technology to all its hospitals and institutes within the healthcare cluster in the near future.
“The advancements in technology, especially in the field of MR, have opened new horizons for the medical profession. By merging other technologies such as artificial intelligence, real-time image recognition, and predictive modelling, these MR devices offer clinicians capabilities that were once thought impossible,” said A/Prof Ngiam Kee Yuan.
Gwangju Institute of Science and Technology and MIT Researchers Develop a Natural and Comfortable “Seamless-walk” Virtual Reality Locomotion System
Limited physical spaces in modern, urban life pose locomotion challenges. Virtual reality (VR) translates such constrained real spaces to larger, virtual spaces using efficient locomotion systems. However, current VR locomotion systems are uncomfortable and raise privacy concerns. To tackle this, researchers from Gwangju Institute of Science and Technology, Korea and MIT CSAIL have now developed “Seamless-walk,” a foot-based VR locomotion system that offers a natural and comfortable experience for applications in VR gaming and healthcare.
Urban real-world environments offer limited physical space for foot-based locomotion and thus present challenges to natural VR locomotion, since virtual environments are much larger than the corresponding real-world environment, a fact that has been noted in past studies (Mandal 2013; Pai and Kunze 2017). To compensate for this challenge, efficient virtual reality (VR) locomotion techniques have been proposed to enable natural and immersive locomotion experiences akin to walking in large, virtual environments. However, VR locomotion systems often require attaching equipment to the body or video-recording the user’s body pose. This leads to discomfort caused by equipment size and discontinuous adjustment, as well as privacy concerns related to capturing the entire body without blind spots. Against this backdrop, researchers from Gwangju Institute of Science and Technology (GIST), Korea, in collaboration with researchers from the Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory (MIT CSAIL), USA, developed a novel foot-based VR locomotion system, called “Seamless-walk,” that offers a more natural and comfortable locomotion experience without requiring any walking equipment or video capture of the user’s body pose, leaving the hands free to interact with objects.
In their recent article published online on 17 January 2023 in the journal Virtual Reality, the researchers, led by Dr. Kyung-Joong Kim, Associate Professor at GIST, have detailed the development of the VR locomotion system. “When we started collaborating with MIT, they introduced an interesting new sensor called the ‘intelligent carpet.’ In our view, it was a great opportunity as well as a challenge for us since it had not been developed for any specific application. Therefore, we wanted to make something practical and interesting with this sensor and our AI technology,” explains Dr. Kim. “Accordingly, we decided to develop a VR game controller with the ‘intelligent carpet’ sensor that would be useful in VR gaming.”
“Seamless-walk” has both immediate and long-term potential applications. “In the long run, we believe that our technology could be used in healthcare. ‘Seamless-walk’ is not only a VR gamepad but also a gait recognition and analysis method,” says Dr. Kim.
Seamless-walk works in the following manner: the intelligent carpet captures high-resolution foot pressure imprints in real time as the user moves around by measuring the applied pressure through resistance changes. The footprint information is then fed into a machine learning model that extracts the strong pressure points using a technique called “K-means clustering.” In this method, the pressure points are divided into two clusters, corresponding to the user’s left and right feet. From these clusters, the user’s body direction and foot intervals are then extracted to estimate the angle and movement speed.
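The clustering-and-estimation step described above can be sketched in Python. This is an illustrative reconstruction, not the authors’ code: the function names, the two-cluster K-means initialisation, and the heading convention (body direction taken perpendicular to the left–right foot axis) are assumptions made for the sketch.

```python
import math

def split_feet(points, iters=20):
    """Cluster 2-D pressure points into two groups (left and right foot)
    with a minimal K-means (K=2). `points` is a list of (x, y) tuples for
    the strongest pressure cells read off the carpet."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    # Initialise the two centroids at a pair of maximally distant points.
    c = [points[0], max(points, key=lambda p: dist(p, points[0]))]
    for _ in range(iters):
        # Assign each pressure point to its nearest centroid ...
        labels = [0 if dist(p, c[0]) <= dist(p, c[1]) else 1 for p in points]
        # ... then move each centroid to the mean of its cluster.
        for k in (0, 1):
            members = [p for p, l in zip(points, labels) if l == k]
            if members:
                c[k] = (sum(p[0] for p in members) / len(members),
                        sum(p[1] for p in members) / len(members))
    return c, labels

def direction_and_speed(prev_centroids, centroids, dt):
    """Estimate the body's facing angle (perpendicular to the left-right
    foot axis) and movement speed from two successive foot-centroid pairs,
    `dt` seconds apart."""
    ax = centroids[1][0] - centroids[0][0]
    ay = centroids[1][1] - centroids[0][1]
    heading = math.degrees(math.atan2(ax, -ay))  # 90-degree rotation of the foot axis
    mid = lambda cs: ((cs[0][0] + cs[1][0]) / 2, (cs[0][1] + cs[1][1]) / 2)
    m0, m1 = mid(prev_centroids), mid(centroids)
    speed = math.hypot(m1[0] - m0[0], m1[1] - m0[1]) / dt  # carpet units per second
    return heading, speed
```

In this sketch, each sensor frame yields two foot centroids; comparing the midpoint of successive centroid pairs gives the walking speed, while the axis between the two feet gives the body orientation fed to the VR controller.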
Moreover, Seamless-walk has a modular structure that enables scalable and inexpensive installation of the touch-sensing platform. The team tested Seamless-walk with 80 individuals in a 3D virtual-world exploration game, demonstrating that the system delivers an immersive, natural, and comfortable experience without compromising the overall VR experience, and that it outperforms existing VR locomotion methods.
“In the future, we plan to add more detailed gait analysis functions to the current system. This would enhance our sensor and gait analysis system to provide fall detection and health monitoring in a comfortable manner without any privacy issues,” highlights Dr. Kim. “This method could also be used at the gym for monitoring the gait of users on the treadmill or checking their balance during weight training.”
Taken together, this novel development has the potential to advance gait analysis in VR gaming and healthcare.
XLA REVEALS ‘METASITES’ 3D INTERNET FRAMEWORK
XLA, a community-driven organization of video game and entertainment industry professionals, will today announce ‘Metasites’, a modular 3D internet framework leveraging metaverse, cloud, and AI technologies.
Funded with an initial $100M USD, XLA Metasites empowers users to transform virtual worlds by creating unique content. XLA Chief Product Officer Alexey Savchenko will present a demo of Metasites at the Game Developers Conference (GDC) in San Francisco this afternoon during a panel called “Emerging Approaches: What the Metaverse can teach us about Social and Narrative Design through Synthetic Media”.
The presentation will reveal Metasites’ high-fidelity graphics, core experience, instant access to the system in the cloud through the browser, OpenAI’s GPT-3.5 integration powering non-player characters and their behavior, the virtual assistant IVEE, the ability to semantically analyze text and generate 3D scenes in UE5, and many other features. Metasites allows creators to deploy Unreal Engine 5 content and connect to the XLA ecosystem of features and services.
“We call them Metasites, and the idea here is that every one of these locations is part of an infinite network of 3D, explorable spaces. Like hyperlinks, portals connect these worlds, and the core infrastructure operates on a series of open protocols and standards – much like today’s web,” Savchenko will say. “We strongly believe that the future of the internet is in 3D and are looking to partner up with creators and brands from all spheres and industries to build it together.”
Access to Metasites for individual creators and enterprise clients will be based on a revenue-share model, giving every user full control and management of their content. Users can access virtual assets and various tools, including a versatile SDK, an experiences editor, an inventory and item management system, payment processing, marketing tools, and more. XLA also aims to provide seamless interoperability for digital items, with various functionalities attached to inventory objects across different environments.
“What we’re creating,” Savchenko will say, “is an evolution of the internet and a new form of storytelling – authors shaping worlds that create their agency, presence, and stories, accessible across multiple platforms. It’s not about bringing games to new audiences. It’s about making it easier to communicate, educate, trade, play, and create while being justly rewarded for your actions.”
Metasites will be revealed at GDC at 4:40 PM PDT in Room 24, West Hall, Moscone Center, alongside Hexagram Founder and CEO Rob Auten. The framework is set for closed beta in summer 2023, with enterprise and creators programs opening next quarter and full commercial release expected in Q4 2023.
NYX Soulmate Launches Match for Web3
NYX, a Web3 startup leveraging NFTs, is excited to announce the launch of Soulmate, an AI-powered matchmaking platform for Web3. NYX Soulmate utilizes proprietary artificial intelligence (AI) and hard science to match users based on who they are, not how they look.
Lee, Chief Executive Officer at NYX Soulmate, said: “Web3 avatars and PFPs are nothing but empty shells if they can’t convey to Metaverse entities who we are. Immersion is incomplete without individuation. NYX opens up a world of possibilities towards the individuation of Web3 experiences through matchmaking.”
Each Soul NFT contains biodata that the AI uses to suggest compatible matches. The platform also rewards participants when they choose to interact socially.
Zr0, Co-founder and Chief Product Officer at NYX Soulmate, said: “We want to leverage web3 to make a positive impact on the fight against the loneliness epidemic.”
Leveraging blockchain technology and smart contracts, Soul NFTs can be privately owned and integrated across platforms. They break down silos, creating a layer of socialization that the user can experience in any NFT community, game or metaverse of their choice.
Soulmate takes users through a process of self-discovery, incorporates fine art into the experience and suggests like-minded people to connect with.
Su, Chief Data Scientist at NYX Soulmate, said: “If we look at references like Ready Player One or Snow Crash, they predict a Metaverse that doesn’t rely on selfies. We are building a matchmaking AI to support that.”
Key features and benefits of NYX Soulmate:
- A way to personalize the web3 ‘face’ you show in the Metaverse.
- Make connections with like-minded individuals through matchmaking algorithms.
- Network in business and personal endeavors with a more informed knowledge of others in the Soulmate ecosystem.
- Own the companion fine art PFP with traits directly informed by your personality.
- A custom-built Discord server serving as a town square for Web3 communities, and the first server with cross-functional networking capabilities.
More details about NYX Soulmate are available on their website.