Simulation Packages Expand Aircraft Design Options

NASA Technology

When engineers explore designs for safer, more fuel efficient, or faster aircraft, they encounter a common problem: they never know exactly what will happen until the vehicle gets off the ground.

“You will never get the complete answer until you build the airplane and fly it,” says Colin Johnson of Desktop Aeronautics. “There are multiple levels of simulation you can do to approximate the vehicle’s performance, however.”

When designing a new air vehicle, computational fluid dynamics, or CFD, comes in very handy for engineers. CFD can predict the flow of fluids and gases around an object—such as over an aircraft’s wing—by running complex calculations of the fluid physics. This information is helpful in assessing the aircraft’s aerodynamic performance and handling characteristics.

In 2001, after several years of development, NASA released a new approach to CFD called Cart3D. The tool provides designers with an automated, highly accurate computer simulation suite to streamline the conceptual analysis of aerospace vehicles. Specifically, it allows users to perform automated CFD analysis on complex vehicle designs. In 2002, the innovation won NASA’s Software of the Year award.

Michael Aftosmis, one of the developers of Cart3D and a fluid mechanics engineer at Ames Research Center, says the main purpose of the program was to remove the mesh generation bottleneck from CFD. A major benefit of Cart3D is that the mesh, or the grid for analyzing designs, is produced automatically. Traditionally, the mesh has been generated by hand, a process that can take months or even years for complex vehicle configurations. Cart3D’s automated volume mesh generation enables even the most complex geometries to be modeled hundreds of times faster, usually within seconds. “It allows a novice user to get the same quality results as an expert,” says Aftosmis.
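
To make the idea of automated Cartesian meshing concrete, here is a deliberately simplified two-dimensional sketch in Python. It assumes nothing about Cart3D’s actual implementation: cells of a regular grid that are cut by the body surface are recursively subdivided, so resolution concentrates at the geometry without any hand-built mesh.

```python
# Conceptual sketch of automated Cartesian mesh refinement (not Cart3D code):
# cells that the body surface passes through are recursively subdivided.
from dataclasses import dataclass

@dataclass
class Cell:
    x: float      # cell-center x
    y: float      # cell-center y
    h: float      # half-width of the square cell
    level: int    # refinement level

def cuts_body(cell, inside):
    """Rough cut-cell test: sample a 3x3 grid of points in the cell and flag
    the cell if the samples disagree about being inside the body."""
    samples = [(cell.x + sx * cell.h, cell.y + sy * cell.h)
               for sx in (-1.0, 0.0, 1.0) for sy in (-1.0, 0.0, 1.0)]
    flags = [inside(px, py) for px, py in samples]
    return any(flags) and not all(flags)

def refine(cell, inside, max_level, mesh):
    """Recursively split cells cut by the geometry; keep all others as-is."""
    if cell.level >= max_level or not cuts_body(cell, inside):
        mesh.append(cell)
        return
    h2 = cell.h / 2.0
    for sx in (-1, 1):
        for sy in (-1, 1):
            child = Cell(cell.x + sx * h2, cell.y + sy * h2, h2, cell.level + 1)
            refine(child, inside, max_level, mesh)

def inside_circle(px, py):
    """Toy 'geometry': a unit circle standing in for a vehicle surface."""
    return px * px + py * py < 1.0

# Refine a [-2, 2] x [-2, 2] domain around the circular body.
mesh = []
refine(Cell(0.0, 0.0, 2.0, 0), inside_circle, max_level=6, mesh=mesh)
print(f"{len(mesh)} cells, finest level {max(c.level for c in mesh)}")
```

Cart3D itself works with cut cells in three dimensions and far more sophisticated refinement and flow-solution machinery; the sketch is only meant to show why the meshing step can be fully automatic.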

Now, a decade later, NASA continues to enhance Cart3D to meet users’ needs for speed, power, and flexibility. Cart3D provides the best of both worlds—the payoff of a complex, high-fidelity simulation with the ease of use and speed of a much simpler, lower-fidelity method. Aftosmis explains that instead of simulating just one case, Cart3D’s ease of use and automation allow a user to efficiently simulate many cases and understand how a vehicle behaves across a range of conditions. “Cart3D is the first tool that was able to do that successfully,” he says.

At NASA, Aftosmis estimates that 300–400 engineers use the package. “We use it for space vehicle design, supersonic aircraft design, and subsonic aircraft design.”

Technology Transfer

To enable more use of Cart3D for private and commercial aviation entities, the Small Business Innovation Research (SBIR) program at Langley Research Center provided funds to Desktop Aeronautics, based in Palo Alto, California, to build a plug-in to Cart3D that increases the code’s accuracy under particular flow conditions. Aftosmis says Desktop Aeronautics delivered valuable results and made Cart3D more applicable for general use. “Now they are bringing the product to market. This is something we never would have had the time to do at NASA. That’s the way the SBIR process is supposed to work.”

In 2010, Desktop Aeronautics acquired a license from Ames to sell Cart3D. The company further enhanced the software by making it cross-platform, incorporating a graphical user interface, and adding specialized features to enable extra computation for the analysis of airplanes with engines and exhaust.

“I think it’s going to be game-changing for CFD,” says Aftosmis. “Cart3D is the only commercial simulation tool that can guarantee the accuracy of every solution the user does.”


Benefits

Today, Desktop Aeronautics employs Cart3D in its consulting services and licenses the spinoff product to clients for in-house use. The company provides commercial licenses and academic licenses for research and development projects.

The software package allows users to perform automated CFD analysis on complex designs and, according to the company, enables geometry acquisition and mesh generation to be performed within a few minutes on most desktop computers.

Simulations generated by Cart3D are assisting organizations in the design of subsonic aircraft, space planes, spacecraft, and high-speed commercial jets. Customers can evaluate the efficiency of their designs through performance metrics such as the lift-to-drag ratio.

“It will assemble a spectrum of solutions for many different cases, and from that spectrum, the cases that perform best give insight into how to improve one’s design,” says Johnson. “Cart3D’s preeminent benefit is that it’s automated and can handle complex geometry. It’s blazing fast. You push a button, and it takes care of the volume meshing and flow measurement.”
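
The “spectrum of solutions” workflow Johnson describes can be pictured as a simple parameter sweep. The sketch below is illustrative only: run_case is a hypothetical placeholder for whatever solver invocation a user has, not a Cart3D interface, and its numbers come from a toy aerodynamic model so the script runs on its own.

```python
# Illustrative parameter sweep: run many flow cases and rank them by
# lift-to-drag ratio (L/D). run_case() is a stand-in for a real CFD run.
import itertools

def run_case(mach, alpha):
    """Placeholder for a CFD case; returns (lift, drag) coefficients."""
    cl = 0.1 * alpha                                          # toy lift curve
    cd = 0.02 + 0.05 * cl ** 2 + 0.01 * max(0.0, mach - 0.8)  # toy drag polar
    return cl, cd

machs = [0.70, 0.78, 0.85]
alphas = [0.0, 2.0, 4.0, 6.0]

results = []
for mach, alpha in itertools.product(machs, alphas):
    cl, cd = run_case(mach, alpha)
    results.append({"mach": mach, "alpha": alpha, "L/D": cl / cd})

best = max(results, key=lambda r: r["L/D"])
print(f"Best case: Mach {best['mach']}, alpha {best['alpha']} deg, "
      f"L/D = {best['L/D']:.1f}")
```

With the sweep automated, the best-performing cases point back to the design changes worth trying next, which is the kind of insight the quote above describes.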

Without building an aircraft, engineers can never be completely certain which design concept will perform best in flight. However, they now have a tool to make the most informed prediction possible.



Web Solutions Inspire Cloud Computing Software

NASA Technology

In 2008, a NASA effort to standardize its websites inspired a breakthrough in cloud computing technology. The innovation has spurred the growth of an entire industry in open source cloud services that has already attracted millions in investment and is currently generating hundreds of millions in revenue.

William Eshagh was part of the project in the early days, when it was known as NASA.net. “The feeling was that there was a proliferation of NASA websites and approaches to building them. Everything looked different, and it was all managed differently—it was a fragmented landscape.”

NASA.net aimed to resolve this problem by providing a standard set of tools and methods for web developers. The developers, in turn, would provide design, code, and functionality for their project while adopting and incorporating NASA’s standardized approach. Says Eshagh, “The basic idea was that the web developer would write their code and upload it to the website, and the website would take care of everything else.”

Even though the project was relatively narrow in its focus, the developers soon realized that they would need bigger, more foundational tools to accomplish the job. “We were trying to create a ‘platform layer,’ which is the concept of giving your code over to the service. But in order to build that, we actually needed to go a step deeper,” says Eshagh.

That next step was to create an “infrastructure service.” Whereas NASA.net was a platform for dealing with one type of application, an infrastructure service is a more general tool with a simpler purpose: to provide access to computing power. While such computing power could be used to run a service like NASA.net, it could also be used for other applications.

Put another way, what the team came to realize was that they needed to create a cloud computing service. Cloud computing is the delivery of software, processing power, and storage over the Internet. Whether these resources are as ordinary as a library of music files or as complex as a network of supercomputers, cloud computing enables an end user to control them remotely and simply. “The idea is to be able to log on to the service and say ‘I want 10 computers,’ and within a minute, I can start using those computers for any purpose whatsoever,” says Eshagh.

As the scope of the project expanded, NASA.net came to be known as Nebula. Much more than setting standards for Agency web developers, Nebula was intended to provide NASA developers, researchers, and scientists with a wide range of services for accessing and managing the large quantities of data the Agency accumulates every day. This was an enormous undertaking that only a high-powered cloud computing platform could provide.

Raymond O’Brien, former program manager of Nebula, says the project was in some ways ahead of its time. “Back in 2008 and 2009, people were still trying to figure out what ‘cloud’ meant. While lots of people were calling themselves ‘cloud enabled’ or ‘cloud ready,’ there were few real commercial offerings. With so little clarity on the issue, there was an opportunity for us to help fill that vacuum.”


As the team built Nebula, one of the most pressing questions they faced was that of open source development, or the practice of building software in full view of the public over the Internet.

On the one hand, proprietary code might have helped the project overcome early hurdles, as commercial software can offer off-the-shelf solutions that speed up development by solving common problems. Proprietary software is sometimes so useful and convenient that the Nebula team wasn’t even sure that they could create the product without relying on closed source solutions at some point.

On the other hand, open source development would facilitate a collaborative environment without borders—literally anyone with the know-how and interest could access the code and improve on it. Because Nebula had evolved into a project that was addressing very general, widespread needs—not just NASA-wide, but potentially worldwide—the possibility of avoiding restrictive licensing agreements by going open source was very attractive.

O’Brien says that broad appeal was an important part of Nebula’s identity. “From the beginning, we wanted this project to involve a very large community—private enterprises, academic institutions, research labs—that would take Nebula and bring it to the next level. It was a dream, a vision. It was that way from the start.”

Despite uncertainties, the development team decided to make Nebula purely open source. Eshagh says the real test for that philosophy came when those constraints were stretched to their limits. “Eventually, we determined that existing open source tools did not fully address Nebula’s requirements,” he says. “But instead of turning to proprietary tools, we decided to write our own.”

The problem was with a component of the software called the cloud controller: the tool that turns a single server or a pool of servers into many virtual servers, which can then be provisioned remotely using software. In effect, the controller gives an end user access, in principle, to as much or as little computing power and storage as needed. Existing tools were either written in the wrong programming language or released under the wrong software license.
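
For readers unfamiliar with the term, the toy Python sketch below illustrates what a cloud controller does. It is purely conceptual and is not the Nebula code: the controller tracks a pool of physical hosts and carves virtual machines out of their spare capacity in response to remote requests.

```python
# Toy cloud controller: place virtual machines onto physical hosts with
# enough spare capacity, and release that capacity when a VM is terminated.
import uuid

class Host:
    def __init__(self, name, vcpus, ram_gb):
        self.name = name
        self.free_vcpus = vcpus
        self.free_ram_gb = ram_gb

class CloudController:
    def __init__(self, hosts):
        self.hosts = hosts
        self.instances = {}          # instance id -> (host, vcpus, ram_gb)

    def provision(self, vcpus, ram_gb):
        """Place a VM on the first host with enough free capacity."""
        for host in self.hosts:
            if host.free_vcpus >= vcpus and host.free_ram_gb >= ram_gb:
                host.free_vcpus -= vcpus
                host.free_ram_gb -= ram_gb
                instance_id = str(uuid.uuid4())
                self.instances[instance_id] = (host, vcpus, ram_gb)
                return instance_id
        raise RuntimeError("no host has enough free capacity")

    def terminate(self, instance_id):
        """Release the VM's resources back to its host."""
        host, vcpus, ram_gb = self.instances.pop(instance_id)
        host.free_vcpus += vcpus
        host.free_ram_gb += ram_gb

# "I want 10 computers": request ten 2-vCPU / 4 GB instances from the pool.
controller = CloudController([Host("node1", 32, 128), Host("node2", 32, 128)])
servers = [controller.provision(vcpus=2, ram_gb=4) for _ in range(10)]
print(f"provisioned {len(servers)} virtual machines")
```

A production controller also has to manage disk images, networking, authentication, and failures; the placement logic shown here is only the kernel of the idea.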

Within a matter of days, the Nebula team had built a new cloud controller from scratch, in Python (their preferred programming language for the controller), and under an open source license. When the team announced this breakthrough on its blog, they immediately began attracting attention from some of the biggest players in the industry. “We believed we were addressing a general problem that would have broad interest,” says Eshagh. “As it turns out, that prediction couldn’t have been more accurate.”

Technology Transfer

Rackspace Inc., of San Antonio, Texas, was one of the companies most interested in the technology. Rackspace runs the second largest public cloud in the world and was at the time offering computing and storage services using software they had created in-house. Jim Curry, general manager of Rackspace Cloud Builders, says they faced hurdles similar to those NASA faced in building a private cloud. “We tried to use available technology,” he says, “but it couldn’t scale up to meet our needs.”

The engineers at Rackspace wrote their own code for a number of years, but Curry says they didn’t see it as a sustainable activity. “We’re a hosting company—people come to us when they want to run standard server environments with a high level of hosting support that we can offer them. Writing proprietary code for unique technologies is not something we wanted to be doing long-term.”

The developers at Rackspace were fans of open source development and had been looking into open source solutions right at the time the Nebula team announced its new cloud controller. “Just weeks before we were going to announce our own open source project, we saw that what NASA had released looked very similar to what we were trying to do.” Curry reached out to the Nebula team, and within a week the two development teams met and agreed that it made sense to collaborate on the project going forward.

Each of the teams brought something to the table, says Curry. “The nice thing about it was that we were more advanced than NASA in some areas and vice versa, and we each complemented the other very well. For example, NASA was further along with their cloud controller, whereas we were further along on the storage side of things.”

The next step was for each organization to make its code open source so the two teams could launch the project as an independent, open entity. Jim Curry says the team at Rackspace was stunned by the speed at which NASA moved through the process. “Within a period of 30–45 days, NASA completed the process of getting the agreements to have this stuff done. From my perspective, they moved as fast as any company I’ve ever worked with, and it was really impressive to watch.”

The OpenStack project, the successor to Nebula with development from Rackspace, was announced in July 2010. As open source software, OpenStack has attracted a very broad community: nearly 2,500 independent developers and 150 companies are a part of it—including such giants as AT&T, HP, Cisco, Dell, and Intel. Semi-annual developers’ conferences, where members of the development community meet to exchange ideas and explore new directions for the software, now attract over 1,000 participants from about two dozen different countries.
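
To give a sense of what “asking a cloud for computers” looks like in practice, the snippet below boots a server through the openstacksdk Python client. This is a generic illustration rather than anything specific to Rackspace or NASA: the cloud, image, flavor, and network names are hypothetical placeholders, and the snippet assumes credentials are already configured in a local clouds.yaml file.

```python
# Boot one server on an OpenStack cloud using openstacksdk.
# "demo-cloud", "ubuntu-22.04", "m1.small", and "private" are placeholders.
import openstack

conn = openstack.connect(cloud="demo-cloud")      # reads clouds.yaml credentials

image = conn.compute.find_image("ubuntu-22.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

server = conn.compute.create_server(
    name="demo-node-1",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)     # block until it is active
print(server.status)
```

Booting ten servers, as in Eshagh’s example, is just a loop around the same call.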

Benefits

Because OpenStack is free, companies that use it to deploy servers do not need to pay licensing fees—fees that can easily total thousands of dollars per server per year. Given the number of companies that have already adopted OpenStack, the software has potentially saved millions of dollars in server costs.

“Before OpenStack,” says Curry, “your only option was to pay someone money to solve the problem that OpenStack is addressing today. For people who want it as a solution, who like the idea of consuming open source, they now have an alternative to proprietary options.”

Not only is OpenStack saving money; it is also generating jobs and revenue at a remarkable pace. Curry says that dozens of Rackspace’s 80 cloud engineering jobs are directly attributable to OpenStack, and that the technology has created hundreds of jobs throughout the industry. “Right now, trying to find someone with OpenStack experience, especially in San Francisco, is nearly impossible, because demand is so high.”

The technology is already generating hundreds of millions in revenue: Rackspace’s public cloud alone—which largely relies on OpenStack—takes in $150 million a year. Curry, Eshagh, and O’Brien all predict that the software will be its own billion-dollar industry within a few years.

Because OpenStack is open source, and is modified and improved by the people who use it, it is more likely to remain a cutting-edge solution for cloud computing needs. Says Eshagh, “We are starting to see the heavyweights in the industry adding services on top of OpenStack—which they can do because they have a common framework to build from. That means we’ll see even more services and products being created.”

In 2012, Rackspace took steps to secure OpenStack’s future as a free and open source project: the company began the process of spinning off the platform into its own nonprofit organization. By separating itself from any one commercial interest, Curry says, the project will be better positioned to continue doing what its founders hoped it would.

O’Brien maintains that OpenStack’s potential is far from being realized. “It’s hard to characterize in advance. If you had asked an expert about Linux years ago, who could have predicted that it would be in nearly everything, as it is today? It’s in phones and mobile devices. It’s in 75 percent of deployed servers. It’s even used to support space missions. OpenStack has a chance to hit something similar to that in cloud computing.”

Curry agrees: “In the future, you can envision almost all computing being done in the cloud, much of which could be powered by OpenStack. I think that NASA will need to receive significant credit for that in the history books. What we’ve been able to do is unbelievable—especially when you remember that it all started in a NASA lab.”


The Center for the Advancement of Science in Space is increasingly green-lighting research projects for the International Space Station.

The crystals on the left were grown in microgravity. Those on the right formed on Earth.  (NASA)

Get ready for the rodents in outer space to outnumber the humans.

The organization expects that the unique conditions of outer space could lead to research breakthroughs.

“We believe there are scientific projects that people haven’t even thought about taking gravity out of the equation, and if they realize how easy it is and how accessible it is to get to the space station they’d be all over it,” said Greg Johnson, the executive director of the Center for the Advancement of Science in Space.

The organization approved 28 projects in 2013 and expects to launch more this year. In 2011 it began managing the U.S. lab on the International Space Station for NASA.

“There are things we can learn about the planet from 250 miles we frankly just can’t learn from here,” Johnson said. “We can learn about algal blooms in oceans. We can better understand patterns in the atmosphere and how they interface with land masses and water masses.”

In zero gravity, human and animal bones degenerate, opening a door for studying osteoporosis. Prolia, a drug designed to treat postmenopausal osteoporosis, was developed using research on lab rats that were tested on the space shuttle Endeavour in 2001.

One of the current experiments on the International Space Station addresses Huntington’s disease, in which proteins clump up in a patient’s brain. The surfaces of the proteins mutate, making it hard for researchers to analyze them. Without an accurate picture of the protein, scientists can’t design a drug to latch onto its surface and serve as a meaningful treatment for Huntington’s disease.

Gwen Owens, a Ph.D. candidate at UCLA-Caltech, is studying the Huntington’s disease protein in crystal form. She heard an NPR segment about the Center for the Advancement of Science in Space and recalled a researcher’s work using microgravity. Given that crystals grow better in space, she figured it was worth pursuing.

“The real bottleneck is getting the crystals to form,” Owens said. “Once we have the crystals, it’s not that easy, but it’s not that hard.” Owens will get a better understanding of just how valuable zero gravity proves to be when her lab gets results back in September.


Mars Cameras Make Panoramic Photography a Snap


Originating Technology/NASA Contribution

The Gigapan robotic platform holds a digital camera.
The Gigapan robotic platform now enables photographers on Earth to capture and create super-sized digital panoramas.

If you wish to explore a Martian landscape without leaving your armchair, a few simple clicks around the NASA Web site will lead you to panoramic photographs taken from the Mars Exploration Rovers, Spirit and Opportunity. Many of the technologies that enable this spectacular Mars photography have also inspired advancements in photography here on Earth, including the panoramic camera (Pancam) and its housing assembly, designed by the Jet Propulsion Laboratory and Cornell University for the Mars missions. Mounted atop each rover, the Pancam mast assembly (PMA) can tilt a full 180 degrees and swivel 360 degrees, allowing for a complete, highly detailed view of the Martian landscape.

The rover Pancams take small, 1 megapixel (1 million pixel) digital photographs, which are stitched together into large panoramas that sometimes measure 4 by 24 megapixels. The Pancam software performs some image correction and stitching after the photographs are transmitted back to Earth. Different lens filters and a spectrometer also assist scientists in their analyses of infrared radiation from the objects in the photographs. These photographs from Mars spurred developers to begin thinking in terms of larger and higher quality images: super-sized digital pictures, or gigapixels, which are images composed of 1 billion or more pixels.

Gigapixel images are more than 200 times the size captured by today’s standard 4 megapixel digital camera. Although originally created for the Mars missions, the detail provided by these large photographs allows for many purposes, not all of which are limited to extraterrestrial photography.
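
The comparison is simple arithmetic, taking the one-billion-pixel threshold given above and a 4-megapixel camera:

$$\frac{1{,}000{,}000{,}000\ \text{pixels}}{4{,}000{,}000\ \text{pixels}} = 250,$$

which is where the “more than 200 times” figure comes from.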

Partnership

The technology behind the Mars rover PMAs inspired Randy Sargent at Ames Research Center and Illah Nourbakhsh at Carnegie Mellon University (CMU) to look at ways consumers might be able to use similar technology for more “down-to-Earth” photography and virtual exploration.

In 2005, Sargent and Nourbakhsh created the Global Connection Project, a collaboration of scientists from CMU, Google Inc., and the National Geographic Society, whose vision is to encourage better understanding of the Earth’s cultures through images. This vision inspired the development of their Gigapan products.

After seeing what the Pancams and PMAs could do, Sargent created a prototype for a consumer version of the robotic camera platform. He worked with Rich LeGrand of Charmed Labs LLC, in Austin, Texas, to design and manufacture the Gigapan robotic platform for standard digital cameras.

Product Outcome

The Gigapan robotic platform is, in essence, an intelligent tripod that enables an amateur photographer to set up detailed shots with ease. A user sets the upper-left and lower-right corners of the panorama, and the Gigapan then captures as many images as the user or scene requires. With this level of automation, a 500-picture panorama is no more complicated than a 4-picture panorama; only the camera’s memory limits the size of the panorama.
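
The planning step behind that automation is easy to picture. The Python sketch below is illustrative only, not GigaPan firmware: given the two corner angles, the camera’s field of view, and a desired overlap between frames, it computes the grid of pan/tilt positions the mount must visit. The field-of-view and overlap values are assumptions standing in for a long telephoto lens.

```python
# Plan the grid of pan/tilt angles for a panorama given its two corners.
import math

def plan_grid(pan_left, tilt_top, pan_right, tilt_bottom,
              h_fov=6.0, v_fov=4.0, overlap=0.3):
    """Return a list of (pan, tilt) angles in degrees, row by row."""
    step_pan = h_fov * (1.0 - overlap)     # advance less than one frame width
    step_tilt = v_fov * (1.0 - overlap)    # so neighboring frames overlap
    n_cols = max(1, math.ceil((pan_right - pan_left) / step_pan) + 1)
    n_rows = max(1, math.ceil((tilt_top - tilt_bottom) / step_tilt) + 1)
    grid = []
    for row in range(n_rows):
        for col in range(n_cols):
            grid.append((pan_left + col * step_pan, tilt_top - row * step_tilt))
    return grid

# A scene 40 degrees wide and 15 degrees tall:
positions = plan_grid(pan_left=-20, tilt_top=10, pan_right=20, tilt_bottom=-5)
print(f"{len(positions)} frames to capture")
```

The mount handles this bookkeeping itself once the corners are set, which is why a 500-frame panorama is no harder to set up than a 4-frame one.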

The Global Connection Project also created two other Gigapan products: a Gigapan Web site and panorama stitching software born from the Ames Vision Workbench, an image processing and computer vision library developed by the Autonomous Systems and Robotics Area in the Intelligent Systems Division.

A high-resolution composite photograph shows a monk atop a temple in Nepal, the temple at a distance, and a restaurant behind the temple.
Gigapan allows a photographer to capture extremely high-resolution panoramas, which a user can explore in depth. In this wide view of Boudhanath Stupa in Kathmandu, Nepal, it is possible to zoom all the way into the smallest, barely visible points in the picture, such as the monk standing on the roof of the temple or the sign above the Tibet Kitchen Restaurant and Bar.
Gigapan panoramic image courtesy of Jessee Mayfield.

The robotic platform works with the stitching software by precisely manipulating and aligning each shot ahead of time. The Gigapan software complements the robotic platform by arranging the parts of the panorama (potentially hundreds of individual photographs) into a grid where they are stitched together into a single, very large Gigapan image.
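
The stitching step can be approximated today with off-the-shelf tools. The snippet below uses OpenCV’s high-level stitcher purely as a stand-in; the actual Gigapan software grew out of the Ames Vision Workbench and is a separate code base, and the frame filenames are placeholders.

```python
# Stitch a set of overlapping frames into one panorama with OpenCV.
import cv2

filenames = ["frame_000.jpg", "frame_001.jpg", "frame_002.jpg"]  # placeholders
images = [img for img in (cv2.imread(name) for name in filenames)
          if img is not None]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print(f"stitching failed with status {status}")
```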

The Global Connection Project won a 2006 “Economic Development Award” from the Tech Museum Awards for its work in creating photographic overlays for Google Earth of areas affected by natural disasters. Government workers and concerned citizens used the images on Google Earth to see which areas needed help in the aftermath of Hurricane Katrina, Hurricane Rita, and the 2005 earthquake in Kashmir.

On the Gigapan Web site, a user can display a wide, bird’s-eye panorama and then zoom in to impressive bug’s-eye detail. On first impression, a panoramic photograph on Gigapan’s site might seem to be simply a wide-angle cityscape of a temple in Kathmandu. With each successive click, however, the user can zoom deeper and deeper into the photo, revealing increasingly clear details: a monk hanging prayer flags on the roof of the temple and, a few blocks behind the temple, the Tibet Kitchen Restaurant and Bar, with a sign urging passersby to taste its gourmet food.

As part of a continuing effort to connect people and cultures, the Global Connection Project encourages all users to upload their own panoramas from around the world on the Gigapan site. Users can explore such varied landscapes as a temple in Nepal, the Burning Man festival in the Nevada desert, a market in Guatemala, or the Boston skyline from the Charles River. Because of the much greater number of pixels, the resolution is unprecedented; the Gigapan software and robotic platforms can theoretically produce prints on 40-foot-wide paper without any loss in quality.

Whether or not photographers use the Gigapan mounts and software, anyone can upload their panoramas to the Gigapan Web site. Many users of Gigapan have uploaded standard panorama photographs, as well (although the site suggests photographs be at least 50 megabytes). This is just fine with the Gigapan and the Global Connection Project coordinators, whose aim is simply to encourage exploration and understanding of the various cultures in our world.

The Fine Family Foundation is sponsoring work with the Global Connection Project to enable botanists, geologists, archeologists, and other scientists around the world to document different aspects of the Earth’s cultures and ecosystems using Gigapan technology. Scientists are using Gigapan to document life in the upper redwood forest canopy in California, volcanoes in Hawaii, and glaciers in Norway.

There are also educational uses for the Gigapan: The Pennsylvania Board of Tourism uses Gigapan for Web site visitors wanting to explore Civil War sites virtually. Also, in collaboration with the United Nations Educational, Scientific and Cultural Organization (UNESCO), the Global Connection Project has distributed Gigapan to students in Pittsburgh, South Africa, and the Republic of Trinidad and Tobago, encouraging them to photograph their local culture and share those panoramas with the world. “The hope is that students will be able to have deeper connections to other cultures,” said Sargent.

A time-lapse Gigapan robotic mount is now in development, and a professional unit for larger SLR-style cameras may be released before the end of 2008.

Gigapan is a trademark of Carnegie Mellon University.

‘NASA Invention of the Year’ Controls Noise and Vibration

Originating Technology/NASA Contribution

Developed at NASA’s Langley Research Center, the Macro-Fiber Composite (MFC) is an innovative, low-cost piezoelectric device designed for controlling vibration, noise, and deflections in composite structural beams and panels. It was created for use on helicopter blades and airplane wings as well as for the shaping of aerospace structures at NASA.

The MFC is an actuator in the form of a thin patch, almost like a 3- by 2-inch bandage composed of piezoelectric fibers, an epoxy matrix, and polyimide electrodes; it is also called a piezocomposite. When a voltage is applied to the MFC, it stretches, and if it is attached to a structure, the surface bends. The major advantages of a piezofiber composite actuator are higher performance, flexibility, and durability compared to a traditional piezoceramic actuator.

MFCs consist of rectangular piezoceramic rods sandwiched between layers of adhesive film containing tiny electrodes that transfer a voltage directly to and from ribbon-shaped rods no thicker than a few tenths of a millimeter. These minuscule actuators behave roughly like human muscles—flexing, stretching, and returning to their original position when electricity is applied.

Because any external mechanical deformation of an MFC package produces a charge on the electrodes proportional to the deflection, its compression or stretching also enables the MFC to be used as a self-powered sensor. Defects within structures can therefore be detected, or small amounts of energy can be collected and stored for later use.
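
Both behaviors follow from the standard linear piezoelectric constitutive relations, written here in simplified one-dimensional form (the MFC’s actual response depends on its fiber layup and electrode pattern):

$$S = s^{E}\,T + d\,E, \qquad D = d\,T + \varepsilon^{T}\,E$$

Here $S$ is strain, $T$ is stress, $E$ is the applied electric field, and $D$ is the electric displacement (charge per unit electrode area); $s^{E}$ is the compliance at constant field, $d$ is the piezoelectric coefficient, and $\varepsilon^{T}$ is the permittivity at constant stress. The $d\,E$ term is the actuator effect described above (an applied field produces strain), while the $d\,T$ term is the sensor effect (mechanical stress produces charge on the electrodes).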

Macro-Fiber Composite
The Macro-Fiber Composite’s flat profile and use as a sensor and an actuator allows for use in critical or tight areas where other technologies with larger volumetric profiles cannot be used.

The MFC’s combination of small size, durability, flexibility, and versatility allows it to be integrated—along with highly efficient electronic control systems—into a wide range of products. Potential applications include sonar; range-measuring and fish-finding equipment; directional-force and fingerprint sensors; flow meters; and vibration/noise control in aircraft and automobiles. Since its original development in 1999, the MFC has been used in government and industrial applications ranging from vibration reduction to structural health monitoring.

NASA has used MFC piezocomposites for alleviating tail buffeting in aircraft, controlling unsteady aerodynamics and noise on helicopter rotor blades, and actively reducing vibrations in large deployable spacecraft structures. The MFC has been used as a sensor for impedance-based health monitoring of launch tower structures at NASA’s Kennedy Space Center, for strain feedback sensing and control in industrial arc welding equipment, in an STS-123 experiment, in solar sail technology, and in ultra-lightweight inflatable structures.

The MFC has been internationally recognized for its innovative design, receiving two prestigious “R&D 100” awards in 2000, including the “R&D Editor’s Choice” award as one of the 100 most significant technical products of the year. The MFC was also the recipient of the International Forum’s prestigious “iF Gold” award, in Germany, for design excellence in 2004. In March 2007, the MFC was awarded the title of “NASA Invention of the Year.”

Partnership

Smart Material Corporation, of Sarasota, Florida, specializes in the development of piezocomposite components. The company licensed the MFC technology from Langley in 2002, and then added it to their line of commercially produced actuators.

It now combines the Langley MFC’s piezoelectric properties with the robustness and conformability of plastics to radically extend the spectrum of commercial applications.

A NASA partnership gives a small company access to research and technologies that allow it to compete with larger corporations. According to Thomas Daue, with Smart Material Corporation, “For a small business, it is almost not possible anymore to spend the money for basic research in high tech products. Licensing technology from the leading research facilities in this country is a very cost effective way to become a player in a new technology field. Many developments ready for licensing at government-owned facilities have already reached the proof of concept status, which would often cost a small business or start-up millions of dollars.”

Smart Material Corporation is now marketing MFCs internationally, with the majority of applications in the United States directed at Federal government research projects or defense-related government contracts. For example, Smart Material Corporation currently sells the materials to Langley, NASA’s Jet Propulsion Laboratory, and Marshall Space Flight Center, where they are used as strain gauge sensors, as well as to the U.S. Air Force and the U.S. Army.

Product Outcome

Macro-Fiber Composite flexing
When compared to standard piezoelectric systems, the MFC is much more durable and provides increased unidirectional control. Furthermore, the MFC is designed to be readily integrated into a system as an add-on component or integrated during manufacture.

To date, Smart Material Corporation has sold MFCs to over 120 customers, including such industry giants as Volkswagen, Toyota, Honda, BMW, General Electric, and the tennis company HEAD. The company also estimates that its customers have filed at least 100 patents for their unique uses of the technology.

Smart Material Corporation’s main manufacturing facilities are located in Dresden, Germany, in the vicinity of the Fraunhofer Institute for Ceramic Technologies and Sintered Materials, one of the world’s leading research institutes in the field of advanced ceramics. Dresden is also the center of the German semiconductor industry, and so provides crucial interdisciplinary resources for further MFC refinement. In addition to its Sarasota facility, the company also has sales offices in Dresden and in Tokyo, Japan.

The company’s product portfolio has grown to include piezoceramic fibers and fiber composites, piezoceramic actuators and sensors, and test equipment for these products. It also offers a compact, lightweight power system for MFC testing and validation.

Smart Material Corporation believes that solid-state actuator systems, including piezoceramics, will have healthy commercial growth in the coming years, with increasing penetration in industrial, medical, automotive, defense, and consumer markets. Consumer applications already on the market include piezoelectric systems as part of audio speakers, phonograph cartridges and microphones, and recreational products requiring vibration control, such as skis, snowboards, baseball bats, hockey sticks, and tennis racquets.

How NASA tech makes an impact in your daily life

NASA suffers from an interesting problem: it gets credit for things it didn’t do and doesn’t get credit for things it did. The public knows that the investment in space and space technologies brings about innovations that improve our daily lives. Exactly what those technologies are, however, is often elusive. NASA is often mistakenly credited with inventing commonplace consumer products to which it had either tangential connections or no connections—certainly not an enabling connection. Meanwhile, the real stories of NASA’s technological achievements are often unknown.


This is an issue that has nagged at NASA since the Apollo program. Before Apollo’s success, travel to other celestial bodies and the associated space technologies were, for many, dreams of the future; with the successful Moon landings came the realization that these cutting-edge technologies were things of the present. This generated keen public interest and an expectation that, since we were now living in a “space age,” the technologies developed for space should reach homes and factories across the country.


This era, the middle portion of the 20th century, was also a period when many new technologies were already reaching the public, spurred by advances in manufacturing and electronics. And while this influx of new consumer electronics and gadgets happened during a time when people were discovering the possibilities of space flight, many of the new goods were not directly related to any space or NASA mission. As a result, to this day, people (sometimes employees of NASA included) often mistake common household goods like microwave ovens, quartz wristwatches, smoke detectors, and barcodes for NASA technologies.  While the Apollo program did bring about many significant spinoff technologies—like some of the first practical uses of the integrated circuit, the predecessor of the modern microchip—the difference between recorded spinoff technologies and public perception is pronounced.


The belief that NASA technologies have direct benefit to our everyday lives, though, is not misplaced.


The benefits of NASA technology are all around us. Among those that have had the greatest impact are:

  • A cardiac pump that functions as a “bridge to transplant” for patients and has saved hundreds of lives.

  • The cameras in many cell phones.

  • Memory foam, a material found in everything from mattresses to sports helmets.

  • Aerodynamics advances that have been widely implemented in truck designs—today nearly all trucks on the road incorporate NASA technology.


  • Liquidmetal alloys that are used in everything from sports equipment to computers and mobile devices.


These and many other products have all benefitted from the Nation’s investments in aerospace technology. The list goes on.


A recent analysis of companies that have commercialized NASA technology shows impressive results: billions of dollars in generated revenue, billions in cost savings, tens of thousands of jobs created, and tens of thousands of lives saved.


NASA is committed to moving technologies and innovations into the mainstream of the U.S. economy, and we actively seek partnerships with U.S. companies that can license NASA innovations and create spinoffs in areas such as health and medicine, consumer goods, transportation, renewable energy and manufacturing.


NASA is also committed to telling this story and to making sure both that the public is aware of the benefits of its investment in space technology and that American industry is aware of the availability of NASA technology research and assistance through its Technology Transfer Program.


Just this month, NASA released its newest edition of Spinoff. A long-standing NASA tradition, this annual report highlights some of the many advances that have come out of NASA’s Technology Transfer Program. Spinoffs in this year’s book alone include:

International Space Station

  • An invisible coating, developed by a NASA Dual-Use Technology partner and tested at NASA facilities, that is capable of breaking down pollutants, eliminating odors, and inhibiting the buildup of grime. The technology’s many applications include enhancing the efficiency of solar cells, sanitizing air in the homes of those suffering from cystic fibrosis, and even transforming buildings and towering modern art sculptures into massive air purifiers.


QC Bot traveling down a hospital hallway

  • A robot assistant now found in the halls of hospitals around the country, helping with everything from registering patients to logging vital signs. The robot has been dubbed “a Mars rover in a hospital” by one of its developers, who employed the expertise he gained working on Mars robotics for NASA to create the technology. The robot is not only easing the workload of hospital staff but also providing an economic return, creating 20 new jobs for its manufacturer.


Two Cricket Trailers at the beach and Interior

  • A recreational trailer designed using the same principles that supplied comfortable living quarters for the crew of the International Space Station. The trailer’s creator used his experience as a NASA architect to create a unique, eco-friendly means for reconnecting with nature and revitalizing interest in our Nation’s parks.


Deep Space 1 spacecraft

  • A solar concentration technology that, for the same amount of silicon, can provide many times the power of conventional panels, thanks in part to innovations developed through a NASA Small Business Innovation Research (SBIR) partnership. The company founded to commercialize these NASA-derived sustainable energy installations now employs 30 workers, all with a mission to move renewable solar power into true mainstream use.


406 MHz personal locator beacon

  • A worldwide search and rescue system founded on NASA innovation and enabled in part by satellite ground stations developed and constructed by a NASA partner. The true value of this spinoff is inestimable: to date, more than 30,000 lives have been saved, on average more than 6 a day, from the highly publicized 2010 rescue of teen sailor Abby Sunderland to the rescues of fishermen, hikers, and adventurers around the world.


The Spinoff report is available online at http://spinoff.nasa.gov, where you will also find a searchable database of the over 1,800 spinoffs NASA has recorded since it began the Spinoff report in 1976.


Space Age Swimsuit Reduces Drag, Breaks Records

Originating Technology/NASA Contribution

An athlete swims toward the camera.
NASA helped Speedo reduce viscous drag in the new LZR Racer by performing surface drag testing and applying expertise in the area of fluid dynamics.

A space shuttle and a competitive swimmer have a lot more in common than people might realize: among other forces, both have to contend with the slowing influence of drag. NASA’s Aeronautics Research Mission Directorate focuses primarily on improving flight efficiency and, more generally, on fluid dynamics, especially the forces of pressure and viscous drag, which are the same for bodies moving through air as for bodies moving through water. Viscous drag is the frictional force that slows an object moving through a fluid such as air or water.

NASA uses wind tunnels for fluid dynamics research, studying the forces of friction in gasses and liquids. Pressure forces, according to Langley Research Center’s Stephen Wilkinson, “dictate the optimal shape and performance of an airplane or other aero/hydro-dynamic body.” In both high-speed flight and swimming, says Wilkinson, a thin boundary layer of reduced velocity fluid surrounds the moving body; this layer is about 2 centimeters thick for a swimmer.
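
The forces at issue follow the standard fluid-dynamics drag relation rather than anything specific to NASA or Speedo:

$$F_D = \tfrac{1}{2}\,\rho\,u^{2}\,C_D\,A$$

where $\rho$ is the fluid density, $u$ the speed, $A$ a reference area (for skin friction, the wetted surface area), and $C_D$ a drag coefficient that lumps together shape and surface effects. Water is roughly 800 times denser than air, so even a small reduction in the drag coefficient is worth real time to a swimmer.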

Partnership

Key areas of compression in the LZR Racer swimsuit
The LZR Racer provides extra compression in key areas to help a swimmer use less energy to swim more quickly.

In spite of some initial skepticism, Los Angeles-based SpeedoUSA asked NASA to help design a swimsuit with reduced drag, shortly after the 2004 Olympics. According to Stuart Isaac, senior vice president of Team Sales and Sports Marketing, “People would look at us and say ‘this isn’t rocket science’ and we began to think, ‘well, actually, maybe it is.’” While most people would not associate space travel with swimwear, rocket science is exactly what SpeedoUSA decided to try. The manufacturer sought a partnership with NASA because of the Agency’s expertise in the field of fluid dynamics and in the area of combating drag.

A 2004 computational fluid dynamics study conducted by Speedo’s Aqualab research and development unit determined that the viscous drag on a swimmer is about 25 percent of the total retarding force. In competitive swimming, where every hundredth of a second counts, the best possible reduction in drag is crucially important. Researchers began flat plate testing of fabrics, using a small wind tunnel developed for earlier research on low-speed viscous drag reduction, and Wilkinson collaborated over the next few years with Speedo’s Aqualab to design what Speedo now considers the most efficient swimsuit yet: the LZR Racer. Surface drag testing was performed with the help of Langley, and additional water flume testing and computational fluid dynamics were performed with guidance from the University of Otago (New Zealand) and ANSYS Inc., a computer-aided engineering firm.

“Speedo had the materials in mind [for the LZR Racer],” explains Isaac, “but we did not know how they would perform in surface friction drag testing, which is where we enlisted the help of NASA.” The manufacturer says the fabric, which Speedo calls LZR Pulse, is not only efficient at reducing drag, but it also repels water and is extremely lightweight. Speedo tested about 100 materials and material coatings before settling on LZR Pulse.

NASA and Speedo performed tests on traditionally sewn seams, ultrasonically welded seams, and the fabric alone, which gave Speedo a baseline for reducing drag caused by seams and helped them identify problem areas. NASA wind tunnel results helped Speedo “create a bonding system that eliminates seams and reduces drag,” according to Isaac. The Speedo LZR Racer is the first fully bonded, full-body swimsuit with ultrasonically welded seams. Instead of sewing overlapping pieces of fabric together, Speedo actually fused the edges ultrasonically, reducing drag by 6 percent. “The ultrasonically welded seams have just slightly more drag than the fabric alone,” Isaac explains. NASA results also showed that a low-profile zipper ultrasonically bonded (not sewn) into the fabric and hidden inside the suit generated 8 percent less drag in wind tunnel tests than a standard zipper. Low-profile seams and zippers were a crucial component in the LZR Racer because the suit consists of multiple connecting fabric pieces—instead of just a few sewn pieces such as found in traditional suits—that provide extra compression for maximum efficiency.

Product Outcome

LZR Racer swimsuit covering the torso and legs of a swimmer

The LZR Racer reduces skin friction drag by covering more skin than traditional swimsuits. Multiple pieces of the water-resistant and extremely lightweight LZR Pulse fabric connect at ultrasonically welded seams and incorporate extremely low-profile zippers to keep viscous drag to a minimum. 

The LZR Racer reduces skin friction drag 24 percent more than the Fastskin, the previous Speedo racing suit fabric; and according to the manufacturer, the LZR Racer uses a Hydro Form Compression System to grip the body like a corset. Speedo experts say this compression helps the swimmers maintain the best form possible and enables them to swim longer and faster since they are using less energy to maintain form. The compression alone improves efficiency up to 5 percent, according to the manufacturer.

Olympic swimmer Katie Hoff, one of the American athletes wearing the suit in 2008 competitions, said that the tight suit helps a swimmer move more quickly through the water, because it “compresses [the] whole body so that [it’s] really streamlined.” Athletes from the French, Australian, and British Olympic teams all participated in testing the new Speedo racing suits.

Similar in style to a wetsuit, the LZR Racer can cover all or part of the legs, depending on personal preference and event. A swimmer can choose a full-body suit that covers the entire torso and extends to the ankles, or can opt for a suit with shorter legs above the knees. The more skin the LZR Racer covers, the more potential it has to reduce skin friction drag. The research seems to have paid off; in March 2008, athletes wearing the LZR Racer broke 13 world records.

Speedo®, LZR Pulse®, LZR Racer®, and FastSkin® are registered trademarks of Speedo Holdings B.V.
