Bose Introduces 3D Printed Prototype Augmented Reality Sunglasses

You’ve probably heard of Google Glass, a brand of optical smart glasses that displays information in a smartphone-like, hands-free format. Like many other smart glasses, it works by overlaying digital objects and information on your real-world view. This, in my opinion, seems like it could be rather distracting, even when the overlay offers a useful service like counting calories. That said, augmented reality (AR) glasses have viable applications in design and engineering, and are becoming more integrated into smart manufacturing workflows.

Audio equipment company Bose, well known for its high-end speakers and headphones, has developed sunglasses that augment reality with sound, delivering context and data through audio rather than through visual information from a camera and a screen.

The company introduced its augmented reality sunglasses this week at SXSW (South by Southwest) in Austin, Texas, where many innovative companies choose to debut their latest creative work each year.

The AR sunglasses look pretty much like any other pair of sunglasses, and according to a Mashable article by Raymond Wong, the 3D printed prototype pair he tried on was “specifically designed to not draw attention in public.”

“Even though they were prototypes, I was impressed by the fit,” Wong wrote. “They’re super light and don’t weigh your face down. All of the electronics, including the battery, are stored inside of the stems.”

Like many other companies, Bose chose to 3D print its product prototype, rather than use a conventional manufacturing method. This can save time and money in the product design and development process, even if the technology won’t work for commercial production.

The Bose AR sunglasses have many great features, including letting you listen to music without bothering the people around you – just like you’re wearing a pair of headphones, instead of sunglasses.

This feat was accomplished by building two narrow directional speakers into the end of each stem. The speakers will send the audio directly into your ears without the use of earbuds, and no one else can hear your music unless they actually press themselves up against the sunglasses…in which case, I suggest you walk away from the person quickly.

The sound quality of the AR sunglasses is also exceptional, which shouldn’t come as a surprise given Bose’s reputation for excellent sound quality.

“I expected the sound to be average given how thin the speakers are, but was honestly blown away by the clarity,” Wong wrote.

The Bose AR sunglasses also know, even without the use of a camera, what you’re looking at when you’re wearing them, thanks to on-board motion sensors that work with GPS coordinates from a paired smartphone to detect the direction you’re facing.

By looking at a specific landmark and double-tapping one of the sunglass stems, you can instantly receive audio information through the speakers. In the future, Bose hopes to partner with content providers and integrate their data, so users will only need to look at something like a restaurant or store to instantly have access to spoken ratings and reviews through the AR sunglasses.
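Bose hasn’t published how its landmark detection works, but the idea of combining a GPS fix with a compass heading is straightforward to sketch. The function names, landmark list, and 15° tolerance below are illustrative assumptions, not Bose’s actual API:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def facing_landmark(user_lat, user_lon, heading_deg, landmarks, tolerance=15):
    """Return the landmark whose bearing best matches the wearer's heading,
    or None if nothing lies within the tolerance (in degrees)."""
    best, best_err = None, tolerance
    for name, lat, lon in landmarks:
        b = bearing_deg(user_lat, user_lon, lat, lon)
        # Smallest angular difference between bearing and heading
        err = abs((b - heading_deg + 180) % 360 - 180)
        if err < best_err:
            best, best_err = name, err
    return best

landmarks = [
    ("Texas State Capitol", 30.2747, -97.7404),
    ("Austin City Hall", 30.2650, -97.7470),
]
# Wearer stands due south of the Capitol, facing north (heading 0 degrees).
print(facing_landmark(30.2600, -97.7404, 0.0, landmarks))  # → Texas State Capitol
```

A real implementation would also have to smooth noisy compass readings and handle landmarks at very different distances, but the core lookup is just this bearing comparison.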

Enter the company’s new $50 million venture fund – Bose is investing in other companies to get some help in building out its AR sunglasses into a viable platform. It’s already got some pretty big names signed on, including Yelp, TuneIn, TripAdvisor, ASICS Studio, and Strava.

[Image: Karissa Bell/Mashable]

Bose’s goal is not only to introduce the services of these content providers onto its own hardware. The company also wants others to, as Wong put it, “build its AR audio tech into other form factors like headphones and helmets.”

One of the coolest features of the Bose AR sunglasses is that they actually recognize gestures you make with your head. For example, if someone is calling you while you’re wearing a pair, you can shake your head to decline the call, or nod to answer it. In addition, you can rotate your head to the left or to the right in order to choose an item off of an audio-based carousel menu.
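Bose hasn’t described its gesture-recognition algorithm, but the basic signal-processing idea can be sketched: a nod is dominated by rotation about the pitch axis, a shake by rotation about the yaw axis. Everything below, including the function name and the energy threshold, is my own illustrative assumption:

```python
def classify_gesture(gyro_samples, threshold=1.0):
    """Classify a short window of gyroscope data as a head gesture.

    gyro_samples: list of (pitch_rate, yaw_rate) tuples in rad/s.
    Returns 'nod', 'shake', or None if the head is essentially still.
    """
    # Accumulate total rotation "energy" on each axis over the window
    pitch_energy = sum(abs(p) for p, _ in gyro_samples)
    yaw_energy = sum(abs(y) for _, y in gyro_samples)
    if max(pitch_energy, yaw_energy) < threshold:
        return None  # head held still: no gesture
    return "nod" if pitch_energy > yaw_energy else "shake"

# An up-down nod: strong pitch rates, little yaw.
nod = [(1.2, 0.1), (-1.1, 0.0), (0.9, -0.1)]
# A side-to-side shake: strong yaw rates, little pitch.
shake = [(0.1, 1.5), (0.0, -1.4), (-0.1, 1.2)]
print(classify_gesture(nod))    # → nod
print(classify_gesture(shake))  # → shake
```

A production system would work on a sliding window of real IMU samples and reject slow, incidental head movement, but comparing per-axis energy is the essence of telling a nod from a shake.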

Before you get too excited about answering your phone just by nodding your head, know that the Bose AR sunglasses will not see a wide release any time soon. However, limited quantities of a slightly “tweaked” version will be released this summer, though we don’t know yet how much they will cost.

The Bose Developer Portal reads, “Imagine a world where everything you see is more valuable, more emotional, and more meaningful — because of what you hear. Introducing Bose AR, the world’s first audio augmented reality platform.”

An SDK for the Bose AR sunglasses will also be available to developers this summer.

What do you think of these augmented reality sunglasses? Discuss this and other 3D printing topics, or share your thoughts in the Facebook comments below.

[Images: Raymond Wong/Mashable, unless otherwise credited]

RoMA: Robotic 3D Printing and Augmented Reality Combine in Interactive Fabrication Platform

[Image: Huaishu Peng]

We often see robotics and 3D printing combined, as well as 3D printing and augmented reality (AR). But Cornell University researcher Huaishu Peng, whose 3D printing work has made headlines multiple times, has been working on a project that combines all three. We see a wide variety of really interesting and innovative research and projects in this field, but I can honestly say that the Robotic Modeling Assistant (RoMA) created by Peng and his team is one of the coolest I’ve come across.

According to Peng’s website, he is interested in the technical aspects of human-computer interaction (HCI), and designs software and hardware systems to enable 3D modeling with interactive experiences, as well as making functional objects using custom fabrication machines.

Peng wrote, “I envision that in the future (1) people will design both the form and the function of everyday objects and (2) a personal fabrication machine will construct not only the 3D appearance, but also the interactivity of its prints.”

Talk about interactive – the RoMA is a fabrication system that gives users a hands-on, in-situ 3D modeling experience, using a robotic arm 3D printer and an AR CAD editor.

Peng, together with fellow Cornell researchers Jimmy Briggs, Cheng-Yao Wang, and Kevin Guo; Joseph Kider with the University of Central Florida; Stefanie Mueller with MIT CSAIL; Patrick Baudisch from the Hasso-Plattner Institute; and Cornell’s François Guimbretière, wrote a paper on the RoMA.

The abstract reads, “We present the Robotic Modeling Assistant (RoMA), an interactive fabrication system providing a fast, precise, hands-on and in-situ modeling experience. As a designer creates a new model using RoMA AR CAD editor, features are constructed concurrently by a 3D printing robotic arm sharing the same design volume. The partially printed physical model then serves as a tangible reference for the designer as she adds new elements to her design. RoMA’s proxemics-inspired handshake mechanism between the designer and the 3D printing robotic arm allows the designer to quickly interrupt printing to access a printed area or to indicate that the robot can take full control of the model to finish printing. RoMA lets users integrate real-world constraints into a design rapidly, allowing them to create well-proportioned tangible artifacts or to extend existing objects. We conclude by presenting the strengths and limitations of our current design.”

Basically, as a designer is using RoMA’s AR CAD editor to draw a new 3D model in the air, a 3D printing robotic arm is building features to augment the model at the same time, in the same design volume.

Then, the partially 3D printed model can act as the designer’s physical point of reference while they continue to add elements to the design.

According to the paper, “To use the RoMA system, a designer wears an Augmented Reality (AR) headset and starts designing inside the print volume using a pair of AR controllers. As soon as a design feature is completed, the RoMA robotic arm prints the new feature onsite, starting in the back half of the design volume. At any time, the designer can bring printed features into the front half of the design volume for use as a physical reference. As she does so, the robot updates its schedule and prints another available part of the model. Once she finishes a design, the designer steps back, allowing the robotic system to take full control of the build platform to finish printing.”

So while it may appear to an onlooker that the designer is pointing the AR controller at nothing, they are really designing a 3D model on the rotating platform below the robotic arm. Then, the arm will 3D print each completed design feature in what appears to be mid-air, around the model that only the designer wearing the headset can see.

Want to build a stand for your model jet, a garage for your LEGO vehicle, or a teapot with a finger hole perfectly designed to fit your finger? RoMA can get the job done. It’s almost like a 3D printing pen, but on a much larger scale, with AR technology and a robotic arm controlling the 3D printing process.

Augmented reality interaction

RoMA users are able to, according to the project page, “integrate real-world constraints into a design rapidly, allowing them to create well-proportioned tangible artifacts,” and even extend an object through in-situ fabrication.

The system includes a ceiling-mounted Adept S850 6DOF robotic arm 3D printer, a rotating platform, and an AR headset with cutter and indicator controllers. In terms of software, RoMA has:

  • An end-to-end pipeline that integrates AR and robot control
  • A custom AR CAD editor
  • A proxemics-inspired handshake mechanism that supports human-robot interaction

The custom AR modeling tool emphasizes interactive design, similar to SketchUp, and is deeply integrated with Rhino CAD modeling software.

To begin the process, the designer needs to stay close to the rotating build platform, which is kept immobile by the 3D printing system. The system then 3D prints the part of the model that’s located in the back half of the platform.

To bring the model forward, all the designer has to do is touch the platform’s handle and rotate it.

The robot arm will automatically park away from the user, until the designer steps away. Then, the robotic fabricator is free to “take the full control of the platform” and complete the build.
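The turn-taking described above can be summarized as a small state machine: while the designer is at the platform the robot prints only in the back half, touching the handle parks the arm, and stepping away hands the robot full control. The class and method names below are my own shorthand for the paper’s handshake mechanism, not the authors’ code:

```python
class RomaHandshake:
    """Toy sketch of RoMA's proxemics-inspired handshake between
    designer and robotic 3D printing arm."""

    def __init__(self):
        self.state = "designer_present"  # robot restricted to the back half

    def touch_handle(self):
        # Designer grabs the rotating platform: the arm parks away.
        self.state = "robot_parked"

    def release_handle(self):
        # Designer lets go but stays close: back-half printing resumes.
        self.state = "designer_present"

    def step_away(self):
        # Designer leaves the work area: the robot takes full control.
        self.state = "robot_full_control"

    def printable_region(self):
        return {
            "designer_present": "back half only",
            "robot_parked": "none (parked)",
            "robot_full_control": "entire platform",
        }[self.state]

session = RomaHandshake()
print(session.printable_region())  # back half only
session.touch_handle()
print(session.printable_region())  # none (parked)
session.release_handle()
session.step_away()
print(session.printable_region())  # entire platform
```

The point of the design is that the robot never contests the workspace: every transition is initiated by the designer’s physical actions, so the printed model is always safe to touch.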

Any strings left behind from the robotic arm’s 3D printing job can be easily removed with the system’s cutter controller.

Discuss this and other 3D printing topics, or share your thoughts below.

[Source/Images: Huaishu Peng]

BMW combines 3D printing & virtual reality to streamline vehicle design

Mar 29, 2017 | By Tess

German auto manufacturer BMW, no stranger to 3D printing technologies, has announced its intention to combine additive manufacturing and virtual reality to help streamline and reduce the costs of its design processes.

3D printing and virtual reality have been developing side by side for several years, with both technologies becoming more advanced and increasingly accessible. It is hardly a surprise, then, that their trajectories have become intertwined in numerous ways. Earlier today we wrote about one instance of this intersection, as tech company HTC released its new MakeVR tool, which allows HTC Vive users to craft 3D models in a virtual environment.

Now, it seems BMW is seeking to explore the benefits of combining both technologies for its own design-related purposes. In designing and developing a new vehicle, BMW would traditionally have to manufacture one or several prototypes for each part—a time-consuming and costly process. With the advancement of 3D printing, however, this task was made significantly easier, as the company was able to additively manufacture one-off prototypes in a more time and cost efficient manner.

By adding virtual reality into the mix, the car manufacturer is hoping to streamline its design and prototyping process even more. That is, in combining VR tech with 3D printing, BMW is confident that it can simplify and speed up its auto design stage by cutting back on the number of parts that even need to be additively manufactured.

How is this going to work? Well, BMW is reportedly working on a VR program (built in collaboration with Unreal Engine) that is capable of recreating a variety of the surface finishes and features that go into BMW’s vehicles. Using the VR technology, the company plans to project these virtual images onto 3D printed parts to see how they will look once finished and built into the car. This will let BMW’s designers spot flaws in a particular design early, and adapt the virtual design accordingly.

Additionally, BMW also intends to use virtual reality and 3D printing in tandem in order to increase the efficiency of inter-departmental communications. By using the two technologies together, BMW says it will be easier to convey design ideas and directions to different teams, and will provide a more user-friendly experience for its employees.

For over 25 years, BMW has been a strong proponent of additive manufacturing technologies, not only using it for its own manufacturing needs, but also investing in up-and-coming 3D printing companies, and collaborating with various organizations, including Team USA. As always, we are eager to see its continued use and advancement of the technology.

EchoPixel Announces True 3D Print Support, Combining Virtual Reality and 3D Model Printing to …

MOUNTAIN VIEW, CA – (Marketwired – Feb 21, 2017) – EchoPixel today announced True 3D print support, a breakthrough set of software tools designed to assist physicians using models they create with their 3D printers. Built on EchoPixel’s True 3D Viewer software, the workflows allow medical professionals to visualize and interact with patient-specific anatomy that can be directly converted into 3D printed models. This allows professionals to create their models with greater quality and accuracy, and to “print right, the first time.”

EchoPixel’s existing True 3D Viewer software enables physicians to see and interact with medical images the way they would with real, physical objects. The system converts existing DICOM datasets into life-size virtual reality objects, allowing physicians to move, turn, dissect, and cut open virtual patient anatomy. The new software tools facilitate seamless transition to printing of 3D models, once a professional has determined the desired anatomy and orientation to print.

“We believe there’s a revolution happening in 3D medical modeling, and it’s just getting started,” said Ron Schilling, CEO of EchoPixel. “3D printing is a game changing technology, but it’s not yet accepted as a widely effective clinical tool, primarily due to the cost and time restrictions. EchoPixel’s Interactive Virtual Reality is a complementary technology that can enable truly effective 3D modeling for the first time. It has the potential to dramatically reduce time and cost investments.”

When supported by EchoPixel’s software tools, 3D modeling has numerous potential benefits across a range of medical applications. These may include:

  • Improved communication and collaboration among different members of the surgical team, including surgeons and other OR staff
  • Enhanced pre-operative planning and better interactive understanding of unique anatomy that can be used as a reference during surgery
  • Mirror-image modeling used for reconstruction templates
  • Practice on models for surgical residents
  • Increased patient education

“We’re excited to establish 3D virtual viewing as part of our 3D program,” said Steve Muyskens, M.D., cardiologist at Cook Children’s Medical Center in Fort Worth, Texas. “Having this technology, in addition to 3D printing capabilities, allows Cook Children’s cardiologists and cardiothoracic surgeons to improve the planning of complex procedures and surgeries. We believe this approach will eventually lead to less time in the operating room and fewer complications.”

EchoPixel will be demonstrating its True 3D Viewer and its Print Support functionality in HP’s booth #1979 at HIMSS 2017. 

About EchoPixel
EchoPixel is building a new world of patient care with its groundbreaking medical visualization software. The company’s FDA-cleared True 3D Viewer uses existing medical image datasets to create virtual reality environments of patient-specific anatomy, allowing physicians to view and dissect images just as they would real, physical objects. The technology aims to make reading medical images more intuitive, help physicians reach diagnosis, and assist in surgical planning. Leading institutions, including the University of California, San Francisco, the Cleveland Clinic, the Lahey Clinic, and more are using True 3D in clinical and research applications. EchoPixel is a privately held, venture backed company located in Mountain View, CA.