Why Should We Spend Money on the NASA Space Program?

NASA budget graph

The value of spending on the space program is illustrated by NASA’s timeline of spinoff technologies [1]. I am assembling fiscal numbers from the information posted on this blog to explore the question, “What is the financial effect of science and technology funding?”, and the Apollo program is a perfect case study. As the timeline describes, Apollo spinoffs include cool suits that alleviate dangers from high-heat environments and medical conditions, kidney dialysis machines that remove toxic waste from used dialysis fluid, a machine that aids physical therapy and athletic development, a stress-free “blow molding” process for manufacturing athletic shoes, water purification technology for communities, freeze-drying for food preservation, and sensors that detect hazardous gases.

In today’s money, the $20.4 billion spent on the Apollo program is equivalent to roughly $109 billion [2]. Now consider the markets that exist today because of these technologies, as a rough estimate of the revenue that would not exist without them: the kidney dialysis market brings in $16 billion A YEAR (+, more than that, counting secondary effects from improved quality of life) [3]; the sports coaching market, which benefits from the “machine [that] aids athletic development,” brings in $6 billion a year (-, less, because such a machine touches only part of the market) [4]; the physical therapy market brings in $30 billion a year (-, less is attributable to the technology) [5]; the athletic shoe market brings in $75 billion a year (-, less) [6]; the water purification market brings in $20 billion a year (+, more, from secondary effects) [7]; and the hazardous gas detection market brings in $2 billion a year (+, more) [8]. Time constraints kept me from assembling fiscal numbers for every technology listed above. The spinoff timeline also does not mention that the Apollo program led to the expendable launch vehicle market, projected at $53 billion over the next ten years [9].

So finally we can make the comparison: the $20.4 billion spent on the Apollo-era space program, about $109 billion in today’s dollars, translates into somewhere around $154.3 billion _a year_ in technology-based markets. Roughly speaking, cutting NASA funding is the farming equivalent of eating your seed corn.
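To make the back-of-the-envelope arithmetic explicit, here is a short Python sketch that reproduces the rough total from the per-market figures cited above. It ignores the +/- qualifiers and simply converts the ten-year launch-vehicle projection to an annual rate, so treat it as the same rough estimate stated in prose.

```python
# Rough annual revenue (in billions of USD) of markets tied to Apollo-era
# spinoff technologies, using the figures cited in references [3]-[8].
markets = {
    "kidney dialysis":         16,   # [3]
    "sports coaching":          6,   # [4]
    "physical therapy":        30,   # [5]
    "athletic shoes":          75,   # [6]
    "water purification":      20,   # [7]
    "hazardous gas detection":  2,   # [8]
}

# Expendable launch vehicles: ~$53B projected over 10 years -> annual rate [9]
launch_vehicles_per_year = 53 / 10

total_per_year = sum(markets.values()) + launch_vehicles_per_year
print(f"Approximate annual market size: ${total_per_year:.1f} billion")
# -> Approximate annual market size: $154.3 billion
```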

[References]

[1] http://spinoff.nasa.gov/Spinoff2008/pdf/timeline_08.pdf

[2] http://en.wikipedia.org/wiki/Apollo_program

[3] http://www.firstresearch.com/Industry-Research/Kidney-Dialysis-Centers.html

[4] http://www.ibisworld.com/industry/default.aspx?indid=1542

[5] http://www.ibisworld.com/industry/default.aspx?indid=1562

[6] http://www.prweb.com/releases/2013/6/prweb10808367.htm

[7] http://www.hkc22.com/waterpurification.html

[8] http://www.prnewswire.com/news-releases/global-gas-sensors-detectors-and-analyzers-markets-190181871.html

[9] http://www.militaryaerospace.com/articles/2011/10/expendable-launch.html

Simulation Packages Expand Aircraft Design Options

NASA Technology

When engineers explore designs for safer, more fuel efficient, or faster aircraft, they encounter a common problem: they never know exactly what will happen until the vehicle gets off the ground.

“You will never get the complete answer until you build the airplane and fly it,” says Colin Johnson of Desktop Aeronautics. “There are multiple levels of simulation you can do to approximate the vehicle’s performance, however.”

When designing a new air vehicle, computational fluid dynamics, or CFD, comes in very handy for engineers. CFD can predict the flow of fluids and gases around an object—such as over an aircraft’s wing—by running complex calculations of the fluid physics. This information is helpful in assessing the aircraft’s aerodynamic performance and handling characteristics.

In 2001, after several years of development, NASA released a new approach to CFD called Cart3D. The tool provides designers with an automated, highly accurate computer simulation suite to streamline the conceptual analysis of aerospace vehicles. Specifically, it allows users to perform automated CFD analysis on complex vehicle designs. In 2002, the innovation won NASA’s Software of the Year award.

Michael Aftosmis, one of the developers of Cart3D and a fluid mechanics engineer at Ames Research Center, says the main purpose of the program was to remove the mesh generation bottleneck from CFD. A major benefit of Cart3D is that the mesh, or the grid for analyzing designs, is produced automatically. Traditionally, the mesh has been generated by hand, and requires months or years to produce for complex vehicle configurations. Cart3D’s automated volume mesh generation enables even the most complex geometries to be modeled hundreds of times faster, usually within seconds. “It allows a novice user to get the same quality results as an expert,” says Aftosmis.
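Cart3D’s actual meshing technology is far more sophisticated than anything that fits in a few lines, but the basic idea of automatically refining a Cartesian grid where it meets a geometry can be illustrated with a toy quadtree sketch in Python; the circular “geometry” and the refinement depth below are arbitrary choices for illustration only.

```python
# Toy illustration of automatic Cartesian (quadtree) mesh refinement around a
# geometry -- here a unit circle. Cells that straddle the boundary are split
# until a maximum depth is reached. This is only a sketch of the general idea,
# not Cart3D's actual algorithm.
def intersects_circle(x0, y0, size, r=1.0):
    """True if the square cell [x0, x0+size] x [y0, y0+size] crosses the circle of radius r."""
    corners = [(x0, y0), (x0 + size, y0), (x0, y0 + size), (x0 + size, y0 + size)]
    inside = [cx * cx + cy * cy < r * r for cx, cy in corners]
    return any(inside) and not all(inside)

def refine(x0, y0, size, depth, max_depth, cells):
    if depth == max_depth or not intersects_circle(x0, y0, size):
        cells.append((x0, y0, size))           # keep this cell as-is
        return
    half = size / 2                            # split into four children
    for dx in (0, half):
        for dy in (0, half):
            refine(x0 + dx, y0 + dy, half, depth + 1, max_depth, cells)

cells = []
refine(-2.0, -2.0, 4.0, 0, max_depth=6, cells=cells)
print(f"{len(cells)} cells, smallest size {min(c[2] for c in cells):.4f}")
```

Cells that straddle the boundary keep splitting while cells far from the geometry stay coarse, which is what makes automated Cartesian meshing fast compared with hand-built grids.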

Now, a decade later, NASA continues to enhance Cart3D to meet users’ needs for speed, power, and flexibility. Cart3D provides the best of both worlds—the payoff of using a complex, high-fidelity simulation with the ease of use and speed of a much simpler, lower-fidelity simulation method. Aftosmis explains how instead of simulating just one case, Cart3D’s ease of use and automation allows a user to efficiently simulate many cases to understand how a vehicle behaves for a range of conditions. “Cart3D is the first tool that was able to do that successfully,” he says.

At NASA, Aftosmis estimates that 300–400 engineers use the package. “We use it for space vehicle design, supersonic aircraft design, and subsonic aircraft design.”

Technology Transfer

To enable more use of Cart3D for private and commercial aviation entities, the Small Business Innovation Research (SBIR) program at Langley Research Center provided funds to Desktop Aeronautics, based in Palo Alto, California, to build a plug-in to Cart3D that increases the code’s accuracy under particular flow conditions. Aftosmis says Desktop Aeronautics delivered valuable results and made Cart3D more applicable for general use. “Now they are bringing the product to market. This is something we never would have had the time to do at NASA. That’s the way the SBIR process is supposed to work.”

In 2010, Desktop Aeronautics acquired a license from Ames to sell Cart3D. The company further enhanced the software by making it cross-platform, incorporating a graphical user interface, and adding specialized features to enable extra computation for the analysis of airplanes with engines and exhaust.

“I think it’s going to be game-changing for CFD,” says Aftosmis. “Cart3D is the only commercial simulation tool that can guarantee the accuracy of every solution the user does.”


Benefits

Today, Desktop Aeronautics employs Cart3D in its consulting services and licenses the spinoff product to clients for in-house use. The company provides commercial licenses and academic licenses for research and development projects.

The software package allows users to perform automated CFD analysis on complex designs and, according to the company, enables geometry acquisition and mesh generation to be performed within a few minutes on most desktop computers.

Simulations generated by Cart3D are assisting organizations in the design of subsonic aircraft, space planes, spacecraft, and high speed commercial jets. Customers are able to simulate the efficiency of designs through performance metrics such as lift-to-drag ratio.

“It will assemble a spectrum of solutions for many different cases, and from that spectrum, the cases that perform best give insight into how to improve one’s design,” says Johnson. “Cart3D’s preeminent benefit is that it’s automated and can handle complex geometry. It’s blazing fast. You push a button, and it takes care of the volume meshing and flow measurement.”
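As a schematic of that workflow only (the solver below is a stand-in placeholder, not Cart3D or its interface), a design sweep amounts to running many cases, scoring each one by a metric such as lift-to-drag ratio, and ranking the results:

```python
# Schematic of a design sweep: run many cases, compute lift-to-drag ratio for
# each, and rank them. The "solver" here is a toy stand-in; in practice each
# case would be a full CFD run with a tool such as Cart3D.
def fake_solver(angle_of_attack_deg):
    """Placeholder aerodynamic model returning (lift, drag) coefficients."""
    a = angle_of_attack_deg
    cl = 0.1 * a                      # toy linear lift curve
    cd = 0.02 + 0.002 * a * a         # toy parabolic drag polar
    return cl, cd

cases = range(0, 13)                  # angles of attack, degrees
results = []
for alpha in cases:
    cl, cd = fake_solver(alpha)
    results.append((cl / cd, alpha))  # lift-to-drag ratio for this case

best_ld, best_alpha = max(results)
print(f"Best L/D = {best_ld:.1f} at {best_alpha} degrees angle of attack")
```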

Without building an aircraft, engineers can never be completely certain which design concept will perform best in flight. However, they now have a tool to make the most informed prediction possible.

[Source]

Web Solutions Inspire Cloud Computing Software

NASA Technology

In 2008, a NASA effort to standardize its websites inspired a breakthrough in cloud computing technology. The innovation has spurred the growth of an entire industry in open source cloud services that has already attracted millions in investment and is currently generating hundreds of millions in revenue.

William Eshagh was part of the project in the early days, when it was known as NASA.net. “The feeling was that there was a proliferation of NASA websites and approaches to building them. Everything looked different, and it was all managed differently—it was a fragmented landscape.”

NASA.net aimed to resolve this problem by providing a standard set of tools and methods for web developers. The developers, in turn, would provide design, code, and functionality for their project while adopting and incorporating NASA’s standardized approach. Says Eshagh, “The basic idea was that the web developer would write their code and upload it to the website, and the website would take care of everything else.”

Even though the project was relatively narrow in its focus, the developers soon realized that they would need bigger, more foundational tools to accomplish the job. “We were trying to create a ‘platform layer,’ which is the concept of giving your code over to the service. But in order to build that, we actually needed to go a step deeper,” says Eshagh.

That next step was to create an “infrastructure service.” Whereas NASA.net was a platform for dealing with one type of application, an infrastructure service is a more general tool with a simpler purpose: to provide access to computing power. While such computing power could be used to run a service like NASA.net, it could also be used for other applications.

Put another way, what the team came to realize was that they needed to create a cloud computing service. Cloud computing is the delivery of software, processing power, and storage over the Internet. Whether these resources are as ordinary as a library of music files or as complex as a network of supercomputers, cloud computing enables an end user to control them remotely and simply. “The idea is to be able to log on to the service and say ‘I want 10 computers,’ and within a minute, I can start using those computers for any purpose whatsoever,” says Eshagh.

As the scope of the project expanded, NASA.net came to be known as Nebula. Much more than setting standards for Agency web developers, Nebula was intended to provide NASA developers, researchers, and scientists with a wide range of services for accessing and managing the large quantities of data the Agency accumulates every day. This was an enormous undertaking that only a high-powered cloud computing platform could provide.

Raymond O’Brien, former program manager of Nebula, says the project was in some ways ahead of its time. “Back in 2008 and 2009, people were still trying to figure out what ‘cloud’ meant. While lots of people were calling themselves ‘cloud enabled’ or ‘cloud ready,’ there were few real commercial offerings. With so little clarity on the issue, there was an opportunity for us to help fill that vacuum.”


As the team built Nebula, one of the most pressing questions they faced was that of open source development, or the practice of building software in full view of the public over the Internet.

On the one hand, proprietary code might have helped the project overcome early hurdles, as commercial software can offer off-the-shelf solutions that speed up development by solving common problems. Proprietary software is sometimes so useful and convenient that the Nebula team wasn’t even sure that they could create the product without relying on closed source solutions at some point.

On the other hand, open source development would facilitate a collaborative environment without borders—literally anyone with the know-how and interest could access the code and improve on it. Because Nebula had evolved into a project that was addressing very general, widespread needs—not just NASA-wide, but potentially worldwide—the possibility of avoiding restrictive licensing agreements by going open source was very attractive.

O’Brien says that broad appeal was an important part of Nebula’s identity. “From the beginning, we wanted this project to involve a very large community—private enterprises, academic institutions, research labs—that would take Nebula and bring it to the next level. It was a dream, a vision. It was that way from the start.”

Despite uncertainties, the development team decided to make Nebula purely open source. Eshagh says the real test for that philosophy came when those constraints were stretched to their limits. “Eventually, we determined that existing open source tools did not fully address Nebula’s requirements,” he says. “But instead of turning to proprietary tools, we decided to write our own.”

The problem was with a component of the software called the cloud controller, or the tool that can turn a single server or pool of servers into many virtual servers, which can then be provisioned remotely using software. In effect, the controller gives an end user access in principle to as much or as little computing power and storage as is needed. Existing tools were either written in the wrong programming language or under the wrong software license.

Within a matter of days, the Nebula team had built a new cloud controller from scratch, in Python (their preferred programming language for the controller), and under an open source license. When the team announced this breakthrough on its blog, they immediately began attracting attention from some of the biggest players in the industry. “We believed we were addressing a general problem that would have broad interest,” says Eshagh. “As it turns out, that prediction couldn’t have been more accurate.”
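The controller the Nebula team wrote is, of course, far more capable than anything shown here; the toy Python sketch below only illustrates the kind of bookkeeping a cloud controller performs, namely accepting requests for virtual servers and allocating them against a pool of physical hosts. All host names and capacities are made up for illustration.

```python
# Toy sketch of what a cloud controller does: track capacity on physical hosts
# and hand out virtual servers on request. Real controllers also handle
# scheduling policy, images, networking, and failure; this is illustration only.
import uuid

class ToyCloudController:
    def __init__(self, hosts):
        # hosts: mapping of host name -> number of free VM slots (made-up units)
        self.free_slots = dict(hosts)
        self.servers = {}                       # server id -> host

    def provision(self, count):
        """Allocate `count` virtual servers, returning their ids."""
        ids = []
        for _ in range(count):
            host = max(self.free_slots, key=self.free_slots.get)
            if self.free_slots[host] == 0:
                raise RuntimeError("out of capacity")
            self.free_slots[host] -= 1
            server_id = str(uuid.uuid4())
            self.servers[server_id] = host
            ids.append(server_id)
        return ids

    def release(self, server_id):
        """Return a server's slot to its host."""
        host = self.servers.pop(server_id)
        self.free_slots[host] += 1

controller = ToyCloudController({"node-a": 8, "node-b": 8})
servers = controller.provision(10)              # "I want 10 computers"
print(f"{len(servers)} servers running across {set(controller.servers.values())}")
```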

Technology Transfer

Rackspace Inc., of San Antonio, Texas, was one of the companies most interested in the technology. Rackspace runs the second largest public cloud in the world and was at the time offering computing and storage services using software they had created in-house. Jim Curry, general manager of Rackspace Cloud Builders, says they faced hurdles similar to those NASA faced in building a private cloud. “We tried to use available technology,” he says, “but it couldn’t scale up to meet our needs.”

The engineers at Rackspace wrote their own code for a number of years, but Curry says they didn’t see it as a sustainable activity. “We’re a hosting company—people come to us when they want to run standard server environments with a high level of hosting support that we can offer them. Writing proprietary code for unique technologies is not something we wanted to be doing long-term.”

The developers at Rackspace were fans of open source development and had been looking into open source solutions right at the time the Nebula team announced its new cloud controller. “Just weeks before we were going to announce our own open source project, we saw that what NASA had released looked very similar to what we were trying to do.” Curry reached out to the Nebula team, and within a week the two development teams met and agreed that it made sense to collaborate on the project going forward.

Each of the teams brought something to the table, says Curry. “The nice thing about it was that we were more advanced than NASA in some areas and vice versa, and we each complemented the other very well. For example, NASA was further along with their cloud controller, whereas we were further along on the storage side of things.”

The next step was for each organization to make its code open source so the two teams could launch the project as an independent, open entity. Jim Curry says the team at Rackspace was stunned by the speed at which NASA moved through the process. “Within a period of 30–45 days, NASA completed the process of getting the agreements to have this stuff done. From my perspective, they moved as fast as any company I’ve ever worked with, and it was really impressive to watch.”

The OpenStack project, the successor to Nebula with development from Rackspace, was announced in July 2010. As open source software, OpenStack has attracted a very broad community: nearly 2,500 independent developers and 150 companies are a part of it—including such giants as AT&T, HP, Cisco, Dell, and Intel. Semi-annual developers’ conferences, where members of the development community meet to exchange ideas and explore new directions for the software, now attract over 1,000 participants from about two dozen different countries.
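For a sense of what the end-user side looks like today, here is a hedged sketch of the “I want 10 computers” idea against an OpenStack cloud, assuming the openstacksdk Python bindings and a cloud configured in clouds.yaml; the cloud name, image, flavor, and network names are placeholders that depend entirely on a given deployment.

```python
# Sketch of requesting ten instances from an OpenStack cloud using the
# openstacksdk bindings. The cloud name, image, flavor, and network below are
# placeholders for whatever your deployment actually provides.
import openstack

conn = openstack.connect(cloud="my-cloud")      # reads clouds.yaml credentials

image = conn.compute.find_image("ubuntu-22.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

servers = []
for i in range(10):
    server = conn.compute.create_server(
        name=f"worker-{i}",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    servers.append(server)

# Block until each instance is ACTIVE before using it
for server in servers:
    conn.compute.wait_for_server(server)

print(f"{len(servers)} instances ready")
```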

Benefits

Because OpenStack is free, companies who use it to deploy servers do not need to pay licensing fees—fees that can easily total thousands of dollars per server per year. With the number of companies that have already adopted OpenStack, the software has potentially saved millions of dollars in server costs.

“Before OpenStack,” says Curry, “your only option was to pay someone money to solve the problem that OpenStack is addressing today. For people who want it as a solution, who like the idea of consuming open source, they now have an alternative to proprietary options.”

Not only is OpenStack saving money; it is also generating jobs and revenue at a remarkable pace. Curry says that dozens of Rackspace’s 80 cloud engineering jobs are directly attributable to OpenStack, and that the technology has created hundreds of jobs throughout the industry. “Right now, trying to find someone with OpenStack experience, especially in San Francisco, is nearly impossible, because demand is so high.”

The technology is currently generating hundreds of millions in revenue: Rackspace’s public cloud alone—which largely relies on OpenStack—currently takes in $150 million a year. Curry, Eshagh, and O’Brien all predict that the software will be its own billion-dollar industry within a few years.

Because OpenStack is open source, and is modified and improved by the people who use it, it is more likely to remain a cutting-edge solution for cloud computing needs. Says Eshagh, “We are starting to see the heavyweights in the industry adding services on top of OpenStack—which they can do because they have a common framework to build from. That means we’ll see even more services and products being created.”

In 2012, Rackspace took steps to secure OpenStack’s future as a free and open source project: the company began the process of spinning off the platform into its own nonprofit organization. By separating itself from any one commercial interest, Curry says, the project will be better positioned to continue doing what its founders hoped it would.

O’Brien maintains that OpenStack’s potential is far from being realized. “It’s hard to characterize in advance. If you had asked an expert about Linux years ago, who could have predicted that it would be in nearly everything, as it is today? It’s in phones and mobile devices. It’s in 75 percent of deployed servers. It’s even used to support space missions. OpenStack has a chance to hit something similar to that in cloud computing.”

Curry agrees: “In the future, you can envision almost all computing being done in the cloud, much of which could be powered by OpenStack. I think that NASA will need to receive significant credit for that in the history books. What we’ve been able to do is unbelievable—especially when you remember that it all started in a NASA lab.”

[Source]

The Center for the Advancement of Science in Space is increasingly green-lighting research projects for the International Space Station.

The crystals on the left were grown in microgravity. Those on the right formed on Earth.  (NASA)

Get ready for the rodents in outer space to outnumber the humans. The Center for the Advancement of Science in Space is increasingly green-lighting research projects for the International Space Station.

The organization expects that the unique conditions of outer space could lead to research breakthroughs.

“We believe there are scientific projects that people haven’t even thought about taking gravity out of the equation, and if they realize how easy it is and how accessible it is to get to the space station they’d be all over it,” said Greg Johnson, the executive director of the Center for the Advancement of Science in Space.

The organization approved 28 projects in 2013 and expects to launch more this year. In 2011 it began managing the U.S. lab on the International Space Station for NASA.

“There are things we can learn about the planet from 250 miles we frankly just can’t learn from here,” Johnson said. “We can learn about algal blooms in oceans. We can better understand patterns in the atmosphere and how they interface with land masses and water masses.”

In zero gravity, human and animal bones degenerate, opening a door for studying osteoporosis. Prolia, a drug designed to treat postmenopausal osteoporosis, was developed using research on lab rats that were tested on the space shuttle Endeavour in 2001.

One of the current experiments taking place on the International Space Station addresses Huntington’s disease, in which proteins clump up in a patient’s brain. The surfaces of the proteins mutate, making it hard for researchers to analyze them. Without an accurate depiction of the protein, scientists can’t design a drug to latch onto the surface and serve as a meaningful treatment for Huntington’s disease.

Gwen Owens, a Ph.D. candidate at UCLA-Caltech, is studying the Huntington’s disease protein in crystal form. She heard an NPR segment about the Center for the Advancement of Science in Space and recalled a researcher’s work using microgravity. Given that crystals grow better in space, she figured it was worth pursuing.

“The real bottleneck is getting the crystals to form,” Owens said. “Once we have the crystals, it’s not that easy, but it’s not that hard.” Owens will get a better understanding of just how valuable zero gravity proves to be when her lab gets results back in September.

[Source]

Mars Cameras Make Panoramic Photography a Snap

Originating Technology/NASA Contribution

The Gigapan robotic platform holds a digital camera.
The Gigapan robotic platform now enables photographers on Earth to capture and create super-sized digital panoramas.

If you wish to explore a Martian landscape without leaving your armchair, a few simple clicks around the NASA Web site will lead you to panoramic photographs taken from the Mars Exploration Rovers, Spirit and Opportunity. Many of the technologies that enable this spectacular Mars photography have also inspired advancements in photography here on Earth, including the panoramic camera (Pancam) and its housing assembly, designed by the Jet Propulsion Laboratory and Cornell University for the Mars missions. Mounted atop each rover, the Pancam mast assembly (PMA) can tilt a full 180 degrees and swivel 360 degrees, allowing for a complete, highly detailed view of the Martian landscape.

The rover Pancams take small, 1 megapixel (1 million pixel) digital photographs, which are stitched together into large panoramas that sometimes measure 4 by 24 megapixels. The Pancam software performs some image correction and stitching after the photographs are transmitted back to Earth. Different lens filters and a spectrometer also assist scientists in their analyses of infrared radiation from the objects in the photographs. These photographs from Mars spurred developers to begin thinking in terms of larger and higher quality images: super-sized digital pictures, or gigapixels, which are images composed of 1 billion or more pixels.

Gigapixel images are more than 200 times the size captured by today’s standard 4 megapixel digital camera. Although originally created for the Mars missions, the detail provided by these large photographs allows for many purposes, not all of which are limited to extraterrestrial photography.
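For a sense of scale, a short calculation shows roughly how many 1-megapixel Pancam-class frames go into a single gigapixel image; the 30 percent overlap between neighboring frames is an assumed value for illustration, since stitching needs some shared pixels to work with.

```python
# How many small frames does a gigapixel image represent? Overlap between
# neighboring frames is needed for stitching; 30% per axis is an assumption
# chosen for illustration.
frame_pixels = 1_000_000            # ~1 megapixel per Pancam-class frame
target_pixels = 1_000_000_000       # 1 gigapixel
overlap = 0.30                      # fraction of each frame shared per axis

effective_pixels = frame_pixels * (1 - overlap) ** 2   # unique pixels per frame
frames_needed = target_pixels / effective_pixels
print(f"Roughly {frames_needed:.0f} frames per gigapixel image")
# -> Roughly 2041 frames per gigapixel image
```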

Partnership

The technology behind the Mars rover PMAs inspired Randy Sargent at Ames Research Center and Illah Nourbakhsh at Carnegie Mellon University (CMU) to look at ways consumers might be able to use similar technology for more “down-to-Earth” photography and virtual exploration.

In 2005, Sargent and Nourbakhsh created the Global Connection Project, a collaboration of scientists from CMU, Google Inc., and the National Geographic Society, whose vision is to encourage better understanding of the Earth’s cultures through images. This vision inspired the development of their Gigapan products.

After seeing what the Pancams and PMAs could do, Sargent created a prototype for a consumer-version of a robotic camera platform. He worked with Rich LeGrand of Charmed Labs LLC, in Austin, Texas, to design and manufacture the Gigapan robotic platform for standard digital cameras.

Product Outcome

The Gigapan robotic platform is, in essence, an intelligent tripod that enables an amateur photographer to set up detailed shots with ease. A user sets the upper-left and lower-right corners of the panorama, and the Gigapan simply will capture as many images as the user or scene requires. With this level of automation, a 500-picture panorama is no more complicated than a 4-picture panorama; only the camera’s memory limits the size of the panorama.
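The Gigapan’s own firmware handles the real capture sequence, but the planning step can be sketched: given the two corner positions and the camera’s field of view, compute the grid of pan/tilt positions to visit. The field-of-view angles and 30 percent overlap below are assumed values for illustration, not Gigapan specifications.

```python
# Sketch of panorama capture planning: given the upper-left and lower-right
# corners (pan/tilt angles in degrees) and the camera's field of view, compute
# the grid of shots. FOV and overlap values are assumptions for illustration.
import math

def plan_grid(upper_left, lower_right, fov_h=40.0, fov_v=27.0, overlap=0.30):
    pan0, tilt0 = upper_left
    pan1, tilt1 = lower_right
    step_h = fov_h * (1 - overlap)            # pan increment between shots
    step_v = fov_v * (1 - overlap)            # tilt increment between shots
    cols = math.ceil(abs(pan1 - pan0) / step_h) + 1
    rows = math.ceil(abs(tilt0 - tilt1) / step_v) + 1
    positions = [(pan0 + c * step_h * (1 if pan1 >= pan0 else -1),
                  tilt0 - r * step_v * (1 if tilt0 >= tilt1 else -1))
                 for r in range(rows) for c in range(cols)]
    return positions

shots = plan_grid(upper_left=(-90.0, 30.0), lower_right=(90.0, -10.0))
print(f"{len(shots)} photographs needed for this panorama")
```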

The Global Connection Project also created two other Gigapan products: a Gigapan Web site and panorama stitching software born from the Ames Vision Workbench, an image processing and computer vision library developed by the Autonomous Systems and Robotics Area in the Intelligent Systems Division.

A high-resolution composite photograph shows a monk atop a temple in Nepal, the temple at a distance, and a restaurant behind the temple.
Gigapan allows a photographer to capture extremely high-resolution panoramas, which a user can explore in depth. In this wide view of Boudhanath Stupa in Kathmandu, Nepal, it is possible to zoom all the way into the smallest, barely visible points in the picture, such as the monk standing on the roof of the temple or the sign above the Tibet Kitchen Restaurant and Bar.
Gigapan panoramic image courtesy of Jessee Mayfield.

The robotic platform works with the stitching software by precisely manipulating and aligning each shot ahead of time. The Gigapan software complements the robotic platform by arranging the parts of the panorama (potentially hundreds of individual photographs) into a grid where they are stitched together into a single, very large Gigapan image.
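The Gigapan stitcher itself grew out of the Ames Vision Workbench; purely as a point of comparison, the open source OpenCV library can stitch a handful of overlapping frames in a few lines (the input file names here are placeholders):

```python
# Stitching overlapping frames into a single panorama with OpenCV. This is a
# generic open source example for comparison, not the Gigapan/Ames stitcher.
# The input file names are placeholders.
import cv2

frames = [cv2.imread(path) for path in ("img_00.jpg", "img_01.jpg", "img_02.jpg")]

stitcher = cv2.Stitcher_create()          # default panorama mode
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:             # status 0 indicates success
    cv2.imwrite("panorama.jpg", panorama)
else:
    print(f"Stitching failed with status code {status}")
```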

The Global Connection Project won a 2006 “Economic Development Award” from the Tech Museum Awards for its work in creating photographic overlays for Google Earth of areas affected by natural disasters. Government workers and concerned citizens used the images on Google Earth to see which areas needed help in the aftermath of Hurricane Katrina, Hurricane Rita, and the 2005 earthquake in Kashmir.

On the Gigapan Web site, a user can display a wide bird’s eye panorama and can then zoom in with impressive bug’s eye high-quality detail. On first impression, a panoramic photograph on Gigapan’s site might seem to be simply a wide-angle cityscape of a temple in Kathmandu. With each successive click, however, the user can zoom deeper and deeper into the photo, revealing more and more clear details: a monk hanging prayer flags on the roof of the temple and the Tibet Kitchen Restaurant and Bar a few blocks behind the temple, with a sign inviting passersby to taste its gourmet food.

As part of a continuing effort to connect people and cultures, the Global Connection Project encourages all users to upload their own panoramas from around the world on the Gigapan site. Users can explore such varied landscapes as a temple in Nepal, the Burning Man festival in the Nevada desert, a market in Guatemala, or the Boston skyline from the Charles River. Because of the much greater number of pixels, the resolution is unprecedented; the Gigapan software and robotic platforms can theoretically produce prints on 40-foot-wide paper without any loss in quality.

Whether or not photographers use the Gigapan mounts and software, anyone can upload their panoramas to the Gigapan Web site. Many users of Gigapan have uploaded standard panorama photographs, as well (although the site suggests photographs be at least 50 megabytes). This is just fine with the Gigapan and the Global Connection Project coordinators, whose aim is simply to encourage exploration and understanding of the various cultures in our world.

The Fine Family Foundation is sponsoring work with the Global Connection Project to enable botanists, geologists, archeologists, and other scientists around the world to document different aspects of the Earth’s cultures and ecosystems using Gigapan technology. Scientists are using Gigapan to document life in the upper redwood forest canopy in California, volcanoes in Hawaii, and glaciers in Norway.

There are also educational uses for the Gigapan: The Pennsylvania Board of Tourism uses Gigapan for Web site visitors wanting to explore Civil War sites virtually. Also, in collaboration with the United Nations Educational, Scientific and Cultural Organization (UNESCO), the Global Connection Project has distributed Gigapan to students in Pittsburgh, South Africa, and the Republic of Trinidad and Tobago, encouraging them to photograph their local culture and share those panoramas with the world. “The hope is that students will be able to have deeper connections to other cultures,” said Sargent.

A time-lapse Gigapan robotic mount is now in development, and a professional unit for larger SLR-style cameras may be released before the end of 2008.

Gigapan is a trademark of Carnegie Mellon University.

‘NASA Invention of the Year’ Controls Noise and Vibration

Originating Technology/NASA Contribution

Developed at NASA’s Langley Research Center, the Macro-Fiber Composite (MFC) is an innovative, low-cost piezoelectric device designed for controlling vibration, noise, and deflections in composite structural beams and panels. It was created for use on helicopter blades and airplane wings as well as for the shaping of aerospace structures at NASA.

The MFC is an actuator in the form of a thin patch, almost like a 3- by 2-inch bandage composed of piezoelectric fibers, an epoxy matrix, and polyimide electrodes; it is also called a piezocomposite. If a voltage is applied to the MFC, it will stretch, and if it is attached to a structure, it will cause the surface to bend. The major advantages of a piezofiber composite actuator are higher performance, flexibility, and durability compared to a traditional piezoceramic actuator.

MFCs consist of rectangular piezoceramic rods sandwiched between layers of adhesive film containing tiny electrodes that transfer a voltage directly to and from ribbon-shaped rods that are no thicker than a few tenths of a millimeter. These minuscule actuators are roughly equivalent to human muscles—flexing, stretching, and returning to their original position when electricity is applied.

Because any external mechanical deformation of an MFC package produces a charge on the electrodes proportional to the deflection, its compression or stretching also enables the MFC to be used as a self-powered sensor. Defects within structures can therefore be detected, or small amounts of energy can be collected and stored for later use.
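The figures in the short sketch below are illustrative values, not the MFC’s published specifications; they simply show how the standard linear piezoelectric relations connect an applied voltage to free strain (actuation) and an applied force to generated charge (sensing).

```python
# Linear piezoelectric relations in their simplest 1-D form, with illustrative
# values (not published MFC specifications). Actuation: strain = d33 * E-field.
# Sensing: charge = d33 * force.
d33 = 400e-12                # piezoelectric charge constant, m/V (or C/N) -- illustrative
electrode_spacing = 0.5e-3   # electrode spacing, m -- illustrative
voltage = 1000.0             # applied drive voltage, V

# Actuation: electric field across the electrodes produces free strain
e_field = voltage / electrode_spacing          # V/m
free_strain = d33 * e_field                    # dimensionless
print(f"Free strain: {free_strain * 1e6:.0f} microstrain")

# Sensing: a mechanical load on the same element generates charge
force = 10.0                                   # applied force, N -- illustrative
charge = d33 * force                           # coulombs
print(f"Generated charge: {charge * 1e9:.1f} nC")
```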

Macro-Fiber Composite
The Macro-Fiber Composite’s flat profile and use as a sensor and an actuator allows for use in critical or tight areas where other technologies with larger volumetric profiles cannot be used.

The MFC’s combination of small size, durability, flexibility, and versatility allows it to be integrated—along with highly efficient electronic control systems—into a wide range of products. Potential applications include sonar; range-measuring and fish-finding equipment; directional-force and fingerprint sensors; flow meters; and vibration/noise control in aircraft and automobiles. Since its original development in 1999, the MFC has been used in government and industrial applications ranging from vibration reduction to structural health monitoring.

NASA has used MFC piezocomposites for alleviating tail buffeting in aircraft, controlling unsteady aerodynamics and noise on helicopter rotor blades, and actively reducing vibrations in large deployable spacecraft structures. The MFC has been used as a sensor for impedance-based health monitoring of launch tower structures at NASA’s Kennedy Space Center, for strain feedback sensing and control in industrial arc welding equipment, in an STS-123 experiment, in solar sail technology, and in ultra-lightweight inflatable structures.

The MFC has been internationally recognized for its innovative design, receiving two prestigious “R&D 100” awards in 2000, including the “R&D Editor’s Choice” award as one of the 100 most significant technical products of the year. The MFC was also the recipient of the International Forum’s prestigious “iF Gold” award, in Germany, for design excellence in 2004. In March 2007, the MFC was awarded the title of “NASA Invention of the Year.”

Partnership

Smart Material Corporation, of Sarasota, Florida, specializes in the development of piezocomposite components. The company licensed the MFC technology from Langley in 2002, and then added it to their line of commercially produced actuators.

The company now combines the Langley MFC’s piezoelectric properties with the robustness and conformability of plastics, radically extending the spectrum of commercial applications.

A NASA partnership gives a small company access to research and technologies that allow it to compete with larger corporations. According to Thomas Daue, with Smart Material Corporation, “For a small business, it is almost not possible anymore to spend the money for basic research in high tech products. Licensing technology from the leading research facilities in this country is a very cost effective way to become a player in a new technology field. Many developments ready for licensing at government-owned facilities have already reached the proof of concept status, which would often cost a small business or start-up millions of dollars.”

Smart Material Corporation is now marketing MFCs internationally, with the majority of applications in the United States directed at Federal government research projects or defense-related government contracts. For example, Smart Material Corporation currently sells the materials to Langley, NASA’s Jet Propulsion Laboratory, and Marshall Space Flight Center, where they are used as strain gauge sensors, as well as to the U.S. Air Force and the U.S. Army.

Product Outcome

Macro-Fiber Composite flexing
When compared to standard piezoelectric systems, the MFC is much more durable and provides increased unidirectional control. Furthermore, the MFC is designed to be readily integrated into a system as an add-on component or integrated during manufacture.

To date, Smart Material Corporation has sold MFCs to over 120 customers, including such industry giants as Volkswagen, Toyota, Honda, BMW, General Electric, and the tennis company, HEAD. The company also estimates that its customers have filed at least 100 patents for their unique uses of the technology.

Smart Material Corporation’s main manufacturing facilities are located in Dresden, Germany, in the vicinity of the Fraunhofer Institute for Ceramic Technologies and Sintered Materials, one of the world’s leading research institutes in the field of advanced ceramics. Dresden is also the center of the German semiconductor industry, and so provides crucial interdisciplinary resources for further MFC refinement. In addition to its Sarasota facility, the company also has sales offices in Dresden and in Tokyo, Japan.

The company’s product portfolio has grown to include piezoceramic fibers and fiber composites, piezoceramic actuators and sensors, and test equipment for these products. It also offers a compact, lightweight power system for MFC testing and validation.

Smart Material Corporation believes that solid-state actuator systems, including piezoceramics, will have healthy commercial growth in the coming years, with increasing penetration in industrial, medical, automotive, defense, and consumer markets. Consumer applications already on the market include piezoelectric systems as part of audio speakers, phonograph cartridges and microphones, and recreational products requiring vibration control, such as skis, snowboards, baseball bats, hockey sticks, and tennis racquets.

Boosting NASA’s Budget Will Help Fix Economy: Neil deGrasse Tyson

 


Reinvigorating space exploration in the United States will require not only boosting NASA’s budget but also getting the public to understand how pushing the boundaries of the space frontier benefits the country’s innovation, culture and economy, said renowned astronomer Neil deGrasse Tyson.

“Space is a $300 billion industry worldwide,” Tyson said. “NASA is a tiny percent of that. [But] that little bit is what inspires dreams.”

He spoke about how space has influenced culture — ranging from how the fins on early rockets inspired fins on automobiles in the 1950s, to how the Apollo 8 mission’s iconic picture taken in 1968 of Earth rising above the horizon of the moon led to a greater appreciation for our planet and the need to protect it. Yet many people outside the space community see it as a special interest group, Tyson said.

“Innovation drives economy,” he said. “It’s especially been true since the Industrial Revolution.”

Tyson advocated doubling NASA’s budget — which President Barack Obama set at $17.7 billion in his 2013 federal budget request — and then laid out a different approach to space exploration that he called somewhat “unorthodox.” Rather than focusing on one destination at a time, Tyson promoted building a core fleet of launch vehicles that can be customized for a variety of missions and for a range of purposes.

“We’re kind of doing that now, but let’s do that as the focus,” Tyson said. “One configuration will get you to the moon. Another will get you to a Lagrangian point. Another will get you to Mars.”

Having an available suite of launch vehicles will open up access to space for a wider range of purposes, which will, in turn, benefit the country’s economy and innovation.

Tyson compared it with the country’s system of interstates, which helped connect cities across the country and made travel more efficient.

“When Eisenhower came back from Europe after he saw the [German] autobahn, and how it survived heavy climatic variation and troop maneuvers, he said, ‘I want some of that in my country,’” Tyson explained. “So he gets everyone to agree to build the interstate system. Did he say, ‘you know, I just want to build it from New York to L.A., because that’s where you should go?’ No. The interstate system connects everybody in whatever way you want. That’s how you grow a system.”

Furthermore, this type of capability can be used for a myriad of purposes, including military endeavors, science missions, commercial expeditions and space tourism.

“Whatever the needs or urges — be they geopolitical, military, economic — space becomes that frontier,” Tyson said. “Not only do you innovate, these innovations make headlines. Those headlines work their way down the educational pipeline. Everybody in school knows about it. You don’t have to set up a program to convince people that being an engineer is cool. They’ll know it just by the cultural presence of those activities. You do that, and it’ll jump-start our dreams.”

[Source]