Ex-Tesla and NASA Engineers Make a Light Bulb That’s Smarter Than You

 


Sometime in early 2013, one of the delivery operations engineers at Tesla leaned back in his chair and took a look around the Silicon Valley office. “It was a sunny day, and I looked up and I thought, ‘Why are these lights on with full power, when full sunlight is coming through the window?’” says Neil Joseph. An online search for a better, responsive bulb yielded only a few expensive commercial products. That October, Joseph (who says even as a kid, his two fascinations were lights and cars) left Tesla to start his own lighting company.


The company is Stack, and its first product is Alba (alba is Italian for ‘sunrise’). The Alba bulbs are designed to work autonomously, both by adjusting light output based on available sunlight and by learning and adapting to their owners’ household habits. As Joseph sees it, it’s the first in a new wave of lighting products to follow the Philips Hue and the LIFX. Those bulbs are smarter than the usual drugstore variety—they can sync with a smartphone app, and even keep rhythm with a song—but they aren’t intelligent by the same standards as the Nest Thermostat or even a tool like Google Maps. In short: They’re connected, but not responsive.

More Than a Gimmick

Embedded in Alba’s light diodes are sensors for motion, occupancy, and ambient light. This meant cofounder Jovi Gacusan, who worked on sensors at NASA, had to create a new core technology: in order to work efficiently, Alba has to both read and react to available light. “If you think about noise-canceling headphones, we have to cancel out the light being emitted by the light itself to understand how much light is in the light source,” Joseph says. The result is a light bulb that can see-saw with natural light and, in doing so, use 60 to 80 percent less energy than a regular LED bulb.
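
What that cancellation enables, in effect, is daylight-compensated dimming. Here is a minimal sketch of the control idea; the names, target level, and self-emission gain are illustrative assumptions, not Stack’s actual firmware:

```python
# Minimal sketch of daylight-compensated dimming (all names/values hypothetical).
# The sensor reads total light = ambient + the bulb's own contribution, so the
# bulb's share is subtracted first (the "noise-canceling headphones" step).

TARGET_LUX = 300.0   # desired illumination at the sensor
SELF_GAIN = 0.8      # lux registered at the sensor per percent of bulb output

def update_output(sensor_lux: float, current_output: float) -> float:
    """Return the bulb output (0-100) needed to hold TARGET_LUX."""
    ambient = max(sensor_lux - SELF_GAIN * current_output, 0.0)  # cancel self-light
    needed = (TARGET_LUX - ambient) / SELF_GAIN                  # fill the shortfall
    return min(max(needed, 0.0), 100.0)                          # clamp to range

# Full sun through the window: ambient alone exceeds the target, so the bulb
# dims to zero, which is where the claimed 60 to 80 percent savings come from.
print(update_output(sensor_lux=500.0, current_output=50.0))  # -> 0.0
```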


Alba’s other main function involves tailoring itself to its owners. Like the Nest, Alba runs algorithms and remembers user habits at home. “If we notice that people are in a certain part of the house at certain times of day, and then they mosey on over to a bedroom, and then they spend more time awake in the bedroom before they go to bed, we can start to light a pathway,” Joseph says. Alba emits light with blue tones in the morning, to help users become alert, and then glows warmer shades of white as the day wears on. Users can adjust all this in the Stack app and create profiles (‘dinner party,’ ‘nap time,’ and so on), and Alba will fold that data into its learning curve too.
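
The color-temperature schedule Joseph describes could be sketched as a simple time-of-day curve. The breakpoints and kelvin values below are guesses for illustration, not Stack’s published numbers:

```python
# Illustrative circadian color-temperature curve: cool, blue-toned light in
# the morning, warming toward amber as the day wears on (values are guesses).

def color_temperature_kelvin(hour: float) -> float:
    """Map hour-of-day (0-24) to a correlated color temperature in kelvin."""
    if 6 <= hour < 12:
        return 6500.0                              # crisp morning light for alertness
    if 12 <= hour < 18:
        return 6500.0 - (hour - 12) / 6 * 3500.0   # taper across the afternoon
    return 2700.0                                  # warm evening and night light

for h in (7, 15, 21):
    print(h, color_temperature_kelvin(h))          # 6500.0, 4750.0, 2700.0
```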

It’s fairly easy to see how Stack, by riding the light bulb’s coattails, could quickly become the spine of the connected home. Everyone has to buy light bulbs, and many of them. And because each Alba bulb contains a Bluetooth module that acts like iBeacon technology, the hardware is already poised to start talking to other smart gadgets.

For now, it’s being used to track users and their movement patterns around the house. (This feature has a dystopian ring to it. Joseph says, “We don’t track any personal data, or anything that’s on an individual user. It’s environmental and used for learning and product performance.”) Down the line, though, Joseph has ambitions of partnering with other companies, from “thermostats to smart beds that track how people sleep,” to eventually build a home that goes beyond convenience, and is actually healthier. “The more data we have, we can see if you’re in a REM cycle, and then know not to wake you.”

Alba is sold as a starter kit with two bulbs, for $150.

[Source]

Calling All Inventors! NASA Wants Your Ideas For Mindshift

 

Inventors, get your minds in gear to apply NASA’s Mindshift technology to your next invention.  NASA (the National Aeronautics and Space Administration) has partnered with Edison Nation to identify the best consumer inventions or product adaptations to use with its powerful mind control tool, Mindshift.

For those of you who are not familiar with the technology, Mindshift is a physiological feedback system, developed by NASA’s Langley Research Center, that monitors and provides feedback about a person’s heart rate, muscle tension, and brainwave activity – all stress indicators.  The feedback system allows you to control these stress indicators so that you can perform optimally.

Today, NASA’s Mindshift technology is being used alongside motion-sensing gaming technology like the Wii and Xbox. Because the Mindshift controls operate independently of the game’s own program and controls, Mindshift provides biofeedback to users without interfering with the game, allowing gamers to modify their stress levels so they can play better. Mindshift is also being explored for use in educational, psychological, and medical settings.
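
Conceptually, a biofeedback layer of this kind sits between the physiological sensors and the game’s input. The sketch below is only an illustration of that idea; the signal names, weights, and ranges are assumptions, not NASA’s implementation:

```python
# Conceptual sketch: a stress index built from physiological readings scales
# controller jitter, so a calm, focused player gets a steady cursor.
# Weights and normalization ranges are illustrative assumptions.
import random

def stress_index(heart_rate: float, muscle_tension: float, beta_alpha: float) -> float:
    """Combine stress indicators into a 0-1 index."""
    hr = min(max((heart_rate - 60.0) / 60.0, 0.0), 1.0)   # 60-120 bpm -> 0-1
    mt = min(max(muscle_tension, 0.0), 1.0)               # normalized muscle tension
    bw = min(max((beta_alpha - 1.0) / 2.0, 0.0), 1.0)     # brainwave (beta/alpha) ratio
    return 0.4 * hr + 0.3 * mt + 0.3 * bw

def jittered_cursor(x: float, y: float, stress: float, max_jitter: float = 25.0):
    """Displace the aim point in proportion to the player's stress."""
    j = stress * max_jitter
    return x + random.uniform(-j, j), y + random.uniform(-j, j)

print(jittered_cursor(320.0, 240.0, stress_index(110.0, 0.7, 2.4)))
```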

Now, NASA is looking for new and adapted consumer products to use with Mindshift.  What activities of home or work life would benefit from the ‘mind control’ that the Mindshift technology offers?  What application can you create or modify to enhance this activity?

[Source]

NASA project could be next big thing in video games

The video game’s description promised that two doctors were about to be swallowed by the jaws of fate.

Watching the action on the screen, Alan Pope said, somewhat unnecessarily, “You want to be careful with the scalpel when it’s jumping around.”

“This is really tricky,” replied Chad Stephens, studying the chest he was about to cut open.

The NASA research scientist wore a headset with sensors that monitored his brain waves. As he focused, his brain waves changed frequency, indicated by changing colors of LED lights below the screen. They flickered between red and yellow, then steadied on green. Stephens made a sudden movement with his hand.

“Oh, that’s good!” Pope exclaimed. A robotic voice from the game “Trauma Center: New Blood” intoned, “The incision has been made.”

Pope has a master’s degree in electrical engineering and a doctorate in clinical psychology. Stephens is working on his psychology dissertation. Both work in aeronautics at NASA’s Langley Research Center, where their core research on cockpit design has led to an interesting spinoff: a biofeedback device that makes video games harder for tense or distracted players.

The application is the latest in Pope’s research on brain wave monitoring. His studies on cockpit automation in the 1990s led to the creation of attention-training games by a North Carolina company.

It’s all part of NASA’s transfer of tax-funded research into the private sector for development of new products and uses that benefit the public as well as the space agency. Langley research has been used by Speedo to create faster racing swimsuits, by Medtronics Inc. to stabilize heart-failure patients, and by NASCAR to clean air inside race cars.

NASA research can be found in medical devices, industrial smokestack scrubbers, panoramic cameras, solar refrigerators, emergency air supplies for coal miners, parachutes for airplanes and more. The agency says its research has been used to save 444,000 lives since 2004, and has generated $5.1 billion for the economy.

The biofeedback technology is commercially available from NASA’s Technology Gateway. Its inventors say it could be used in anything from video games to training simulations for doctors and athletes – wherever improved concentration would also improve performance.

“We didn’t invent the Wii, and we didn’t create the game,” Stephens said, “but we figured out how we could modify the controls to serve as feedback to the user.”

In April, NASA obtained a patent for using the technology, called MindShift, on the Wii gaming system. An application has been submitted for Kinect.

“If a company was interested, it could involve this biofeedback mechanism into a game,” Stephens said. “For example, a wizard could use his mental powers to go through the game. Maybe use it in a challenge here and there to demonstrate his skill.

“We could measure fear and have that impact the game in some way. It has the potential to involve a player’s real emotion into the game.”

NASA uses the technology to assess new cockpit designs by monitoring brain waves, heart rate and eye movement of pilots in a simulator.

“We want to be able to sense what kind of state they’re in and provide feedback on a continuous basis so they can understand how their state affects their performance,” Pope said. “If we design something that takes up too much of their attention or doesn’t draw their attention, we probably have poor design.”

Pilots can be taught to focus through standard biofeedback training, where the subject tries to control his heart rate, for example, while moving a line or a bar on a monitor. That can be boring, Pope said. Many of the test pilots enjoyed golf, so the researchers incorporated the biofeedback training into a game where being “in the zone” – mind in perfect sync with body – is important.

They tested a putting green that reacted to a player’s brain waves. When stressed or unfocused, the biofeedback mechanism caused the indoor putting green to undulate, the hole to shrink and a targeting laser to swing back and forth. By relaxing and focusing on the task at hand, players could sink the putt more easily.

Then the researchers tried it with video games, which have real-world possibilities, too.

“Playing a surgery game with an unsteady hand is challenging,” Pope said. “You could imagine this as a surgeon in training learning how to focus, while it teaches the manual skill of surgery.”

They booted up “Link’s Crossbow Training” on the Wii. Pope put on the headset, with a clip on his left earlobe to track his pulse. The cursor jumped around, and he couldn’t aim until his heart rate slowed.

He picked off a single target on the first screen and two targets on the next. Then five targets popped up at once, and the cursor jumped around so he couldn’t aim.

“Got a little excited there,” Pope said. As he calmed down and focused, the cursor steadied and he knocked down the targets one after the other.

“We think this is the next stage beyond motion sensing games,” he said. “That’s why we call it MindShift. We wanted developers to think about games in different ways. That would require a mind shift.”

[Source]

Ultraviolet-Blocking Lenses Protect, Enhance Vision

In the 1980s, Jet Propulsion Laboratory (JPL) scientists James Stephens and Charles Miller were studying the harmful properties of light in space, as well as those of the artificial radiation produced during laser and welding work. The intense light emitted during welding can harm unprotected eyes, leading to a condition called arc eye, in which ultraviolet light causes inflammation of the cornea and long-term retinal damage.


Based on work done at NASA’s Jet Propulsion Laboratory, Eagle Eyes lenses filter out harmful radiation, reduce light scattering, and permit vision-enhancing wavelengths of light, protecting eyesight while also improving visibility.

To combat this danger, the JPL scientists developed a welding curtain capable of absorbing, filtering, and scattering the dangerous light. The curtain employed a light-filtering/vision-enhancing system based on dyes and tiny particles of zinc oxide—unique methods they discovered by studying birds of prey. The birds require near-perfect vision for hunting and survival, often needing to spot prey from great distances. The birds’ eyes produce tiny droplets of oil that filter out harmful radiation and permit only certain visible wavelengths of light through, protecting the eye while enhancing eyesight. The researchers replicated this oil droplet process in creating the protective welding curtain.

 

The welding curtain was commercialized, and then the scientists focused attention on another area where blocking ultraviolet light would be beneficial to the eyes: sunglasses. In 2010, the groundbreaking eyewear technology was inducted into the Space Foundation’s Space Technology Hall of Fame, which honors a select few products each year that have stemmed from space research and improved our lives here on Earth.

Partnership

SunTiger Inc.—now Eagle Eyes Optics, of Calabasas, California—was formed to market a full line of sunglasses based on the licensed NASA technology that promises 100-percent elimination of harmful wavelengths and enhanced visual clarity. Today, Eagle Eyes sunglasses are worn by millions of people around the world who enjoy the protective and vision-enhancing benefits.

Product Outcome

The Eagle Eyes lens makes scenes more vivid because harmless colors such as red, orange, yellow, and green are enhanced, while damaging rays in the blue, violet, and ultraviolet (UV) wavelengths are blocked.

Maximum eye protection from the Sun’s harmful ultraviolet rays is critical to our ability to see clearly. When light enters the eye, a series of events happens that can help, hinder, or even destroy our eyesight. First, light passes through the cornea and ultimately reaches the retina, which contains two types of cells: rods (which handle vision in low light) and cones (which handle color vision and detail). The retina contains 100 million rods and 7 million cones. The outer segment of a rod or a cone contains the photosensitive chemical rhodopsin, also called “visual purple.” Rhodopsin is the chemical that allows night vision, and it is extremely sensitive to light. When exposed to a full spectrum of light, rhodopsin immediately bleaches out and takes about 30 minutes to regenerate fully, with most of the adaptation occurring in the dark within 5 to 10 minutes. Rhodopsin is less sensitive to the longer red wavelengths of light and is therefore depleted more slowly (which is why many people use red light to help preserve night vision). When our eyes are exposed to the Sun’s harmful rays (UVA, UVB, and blue light), damage to the eye and its complex vision-enhancing processes can occur and go unnoticed until years after exposure.

The most common form of eye damage related to ultraviolet exposure, cataracts, causes the lens of the eye to cloud, losing transparency and leading to reduced vision and, if left untreated, blindness. In the United States alone, cataracts are estimated to diminish the eyesight of millions of people at a cost of billions of dollars. Other forms of eye damage directly attributable to ultraviolet exposure include pterygium, an abnormal mass of tissue arising from the conjunctiva of the inner corner of the eye; skin cancer around the eyes; and macular degeneration, which damages the center of the retina and prevents people from seeing fine details.

Alan Mittleman, president and CEO of Eagle Eyes, explains: “When we’re born, our eyes are clear like drops of water. Throughout life, we start to destroy those sensitive tissues, causing the yellowing of the eyes and the gradual worsening of eyesight. When the eye becomes more and more murky, cataracts form. Simple protection of the human eye, from childhood and throughout adulthood, could protect the clarity of the eye and extend good vision for many years—even our entire lifetime.

“It has only been recently,” he adds, “that people started to realize the importance of this.” Sunglass manufacturers are recognizing the importance of eye care, and consumers are becoming more aware of eye health. One issue still plaguing the sunglass market, though, is that consumers assume that darker lenses are more protective, which is not always the case.

 

Dark lenses may feel more comfortable, but in addition to reducing the field of vision, they relax the eye, which allows more blue light to reach the retina directly. Blue light, in particular, has long-term implications, because it passes through the cornea and damages the inner retinal area.

The Eagle Eyes lens allows wearers to see more clearly because it blocks not only ultraviolet light but, more importantly, this blue light, passing the beneficial visible wavelengths while screening out the harmful ones.
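
As a toy model, the behavior described above amounts to a step-shaped transmission curve over wavelength. The cutoffs and pass fractions below are illustrative assumptions, not Eagle Eyes’ actual specification:

```python
# Toy spectral filter: block UV outright, heavily attenuate violet/blue, and
# pass the longer "vision-enhancing" wavelengths (all numbers illustrative).

def transmission(wavelength_nm: float) -> float:
    """Approximate fraction of light passed at a given wavelength."""
    if wavelength_nm < 400:    # ultraviolet: blocked entirely
        return 0.0
    if wavelength_nm < 475:    # violet and most blue: heavily attenuated
        return 0.1
    return 0.95                # green through red: passed nearly in full

for wl, band in [(380, "UV"), (450, "blue"), (550, "green"), (650, "red")]:
    print(f"{band:>5} ({wl} nm): {transmission(wl):.0%} transmitted")
```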

In keeping with its many donations over the years and its goal of spreading good vision and eye protection to remote areas of the world, Eagle Eyes Optics recently had the opportunity to assist a group in sore need of eye protection: children in Galena, Alaska. The incidence of cataracts is 300 times greater in Alaska because of the Sun’s reflection off the snow. Eagle Eyes donated 150 pairs of its sunglasses to a high school in Galena; they were delivered by members of the Space Foundation and presented by former NASA astronaut Livingston Holder.

[Source]

Why Should We Spend Money on the NASA Space Program?

The point of spending on the space program is illustrated by the timeline of spinoff technologies [1]. I’ve been assembling fiscal numbers from the information posted on this blog to explore the question, “What is the financial effect of science and technology funding?”, and the Apollo program is a perfect example for looking into it. As the timeline discusses, Apollo led to cool suits that alleviate dangers from high-heat environments and medical conditions; kidney dialysis machines that remove toxic waste from used dialysis fluid; a machine that aids physical therapy and athletic development; a stress-free “blow molding” process for manufacturing athletic shoes; water purification technology that benefits communities; freeze-drying for food preservation; and sensors that detect hazardous gases.

In today’s money, the $20.4 billion spent on the Apollo program is equivalent to $109 billion [2]. Now consider the markets that exist because of these technologies (that is, an estimate of the revenue that would disappear if the technology did not exist): the dialysis market brings in $16 billion A YEAR (an underestimate, given secondary effects from increased quality of life) [3]; the sports coaching market, which benefits from a machine that aids athletic development, brings in $6 billion a year (an overestimate, since such a machine would affect only part of the market) [4]; the physical therapy market brings in $30 billion a year (less than that is attributable to the technology) [5]; the athletic shoe market brings in $75 billion a year (likewise less) [6]; the water purification market brings in $20 billion a year (more, counting secondary effects) [7]; and the hazardous gas detection market is $2 billion a year (more) [8]. Note that time constraints kept me from assembling fiscal numbers for all of the technologies listed above. Also, the spinoff timeline does not mention that the Apollo program led to the expendable launch vehicle market, worth $53 billion over the next 10 years [9].

So finally, we can make the comparison: $20.4 billion spent on the space program in 1970 translates into somewhere around $154.3 billion _A YEAR_ in technology-based markets. What this roughly indicates is that cutting NASA funding is the farming equivalent of eating your seed corn.
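
For the curious, here is the arithmetic behind that $154.3 billion figure, summing the market estimates cited above:

```python
# Annual market sizes cited above, in billions of dollars.
annual_markets = {
    "kidney dialysis": 16,
    "sports coaching": 6,
    "physical therapy": 30,
    "athletic shoes": 75,
    "water purification": 20,
    "hazardous gas detection": 2,
}
elv_per_year = 53 / 10   # $53B over the next 10 years -> $5.3B a year

total = sum(annual_markets.values()) + elv_per_year
print(f"${total:.1f} billion per year")   # -> $154.3 billion per year
```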

[References]

[1] http://spinoff.nasa.gov/Spinoff2008/pdf/timeline_08.pdf

[2] http://en.wikipedia.org/wiki/Apollo_program

[3] http://www.firstresearch.com/Industry-Research/Kidney-Dialysis-Centers.html

[4] http://www.ibisworld.com/industry/default.aspx?indid=1542

[5] http://www.ibisworld.com/industry/default.aspx?indid=1562

[6] http://www.prweb.com/releases/2013/6/prweb10808367.htm

[7] http://www.hkc22.com/waterpurification.html

[8] http://www.prnewswire.com/news-releases/global-gas-sensors-detectors-and-analyzers-markets-190181871.html

[9] http://www.militaryaerospace.com/articles/2011/10/expendable-launch.html

Archiving Innovations Preserve Essential Historical Records

NASA Technology

The Moon hosts perhaps the most fascinating museum that no one ever visits. From reflectometers to space boots, the lunar module’s descent stage, and the famous first footprints left behind by Neil Armstrong, the Apollo 11 mission alone left over 100 artifacts on the Moon’s surface.

Among the items at rest in the lunar regolith is an aluminum capsule containing a simple silicon disc about the size of a half-dollar coin. The disc displays messages of goodwill from 73 countries and four US presidents, all inscribed in letters a quarter of the width of a human hair, visible only under a microscope.

In order to compile the commemorative messages into a format that would not significantly add to Apollo 11’s payload, while also being able to endure the wild temperature extremes of the Moon’s surface, NASA partnered with Worcester, Massachusetts-based Sprague Electric Company. The company, which had already worked with the Agency to produce multiple components for the Apollo 11 mission, applied semiconductor manufacturing techniques to etch the messages on silicon, sizing them down until each individual message was smaller than the head of a pin.

Sprague patented the technique in 1971, but it would be decades before a pair of innovators built upon the NASA-derived technology to create a spinoff that is now preserving vital records on Earth in the same way that the world’s support of Apollo 11 is preserved on the Moon.

Technology Transfer

In 2006, P.R. Mukund, professor of electrical engineering at Rochester Institute of Technology, and PhD student Ajay Pasupuleti took on a task working with a material not typical to their field—palm leaves. The pair joined a project to preserve the content of the Sarvamoola Grantha, an ancient work of Hindu spiritual scholarship transcribed on stacks of palm leaves that, after some 700 years, were in severe danger of being lost to decay. Working with a team experienced in the preservation of ancient documents such as the Archimedes Palimpsest and the Dead Sea Scrolls, Mukund and Pasupuleti captured and preserved the text of the Sarvamoola Grantha using infrared imagery, winning international acclaim in the process.

“After we did that, we thought about creating a technology that would combine the best of ancient preservation technology with the most recent,” says Mukund. There are pros and cons on both ends of the spectrum, he explains. Stone tablets preserve information for thousands of years—but are not exactly easy to share or copy. Electronic images are ideal for sharing and are easy to copy, manipulate, and archive, but rapid changes in digital formats and storage media threaten to make certain formats of archived information obsolete. Physical storage media such as DVDs and even flash memory can also experience “bit rot,” in which data can be lost or corrupted.

“This is not exactly the way you want to save things that are meant to be preserved for 500 or 1,000 years,” Mukund says.

Mukund and Pasupuleti hit upon a solution within their field of expertise. Silicon, used extensively in the semiconductor industry, is highly durable and resistant to water, humidity, and temperatures up to 572 °F. Research into using silicon and semiconductor production techniques to create enduring copies of documents and records turned up the patent from NASA’s Apollo 11 mission. Since the patent had expired and entered the public domain, Mukund and Pasupuleti were able to use it as a building block for their own patented archiving technology. The pair formed NanoArk Corporation of Fairport, New York, to bring the innovation to market.

“People always ask, when we spend taxpayer dollars on the Space Program: Other than going to space, what does the technology do?” Mukund says. “Here is a classic story where it took more than 40 years before somebody actually thought of commercializing it.”


Benefits

NanoArk’s Waferfiche technology employs a photolithographic process to inscribe minute copies of documents onto thin silicon discs, each of which can fit about 2,000 letter-sized images. The Waferfiche’s inherent material properties render it resistant to fire over a limited duration and, more importantly, practically impervious to water damage. NanoArk notes that the biggest threat to any kind of archived information is moisture.

“Last year, in New York state alone, 65 local governments lost records due to water damage,” says Pasupuleti, now NanoArk’s vice president of technology. “Our technology is water resistant. You can leave it in water for months, take it out, and just wipe it down with a cloth.”

NanoArk claims its Waferfiche technology will preserve essential records for 500 years. While the currently prevalent archiving technology, microfilm, offers the same longevity, it can do so only in temperature- and humidity-controlled environments. Waferfiche requires no such care, resulting in growing cost savings over time and minimal environmental impact. If a customer needs to retrieve archived information from a Waferfiche, the only technology required is a magnifying glass.

“There is nothing out there that can beat this technology in terms of long-term preservation of important records and documents,” says Mukund, NanoArk’s president and CEO.

Since introducing the technology commercially in 2008, NanoArk has grown from 2 employees to 10, experienced a five-fold increase in revenue between 2010 and 2011, and recently opened an office in Hyderabad, India. It counts multiple town, county, and state governments among its customers, as well as universities, and has plans to market the technology as a franchise. One of the keys to the innovation’s future success, says Mukund, is its NASA connection.

“When people find out that this technology is what NASA used as a time capsule on the Moon,” he says, “surely that is a good enough credential as far as the viability of the technology for long-term preservation.”

[Source]

NASA Installs LITO Technologies’ Innovative Perimeter Monitoring System

[Image: LITO Technologies’ Perimeter Security System, installed at NASA Glenn Research Center (PRNewsFoto/LITO Technologies, Inc.)]

A collaboration of LITO Technologies, Inc. and NASA’s Glenn Research Center, Cleveland, Ohio, has resulted in the launch of new technology which will revolutionize security monitoring systems. Furthermore, this type of collaboration could prove to be a new model for helping publicly funded technology make the transition to privately funded commercialization.

A product of NASA’s SBIR grants, the LITO (Laser Imaging Through Obscurants) system, from LITO Technologies, Inc., allows monitoring and sensing in visually obscured conditions such as rain, snow, fog, hail, smoke, sand, and dust. It was initially designed as an onboard sensor for aircraft landing, but its uses for perimeter screening in high-security settings became immediately evident.

LITO allows for continual monitoring of one or more locations, regardless of how bad the weather gets.

Last month, the LITO Omnivid-1 security system was installed at NASA Glenn for perimeter protection and will also provide a demonstration for other government agencies that may have similar perimeter security requirements.

Because the LITO system can also see through obscurants such as sand, dust, and even fire, the LITO technology will soon cover many sectors including aircraft imaging, border security, firefighting, search and rescue, and potentially even asteroid mapping.

The entire project is an example of the public and private sectors partnering, with input from forward-thinking entrepreneurs and scientists like LITO’s Robert Foraker, Jared Sullivan, and Dr. Richard Billmers, who are working in cooperation with NASA technology projects to create and commercialize advanced technical solutions effectively.

Dr. Richard Billmers, also of RL Associates of Yardley, Pennsylvania, had developed elements of the LITO system for onboard aircraft sensors under NASA SBIR aviation grants, in conjunction with NASA Langley Research Center, Hampton, Virginia. Successful demonstrations on the ground and in flight have resulted in a commercialization strategy that begins with perimeter security applications. The commercialization path will lead to smaller, more compact LITO systems that will eventually be used in aircraft to visualize runway scenarios in inclement weather as part of the Next Generation of Air Flight program.

Billmers was invited by NASA to present the LITO system at its 2012 Technology Days event in Cleveland, Ohio. The event showcased technologies and companies in an effort to stimulate new startups and economic growth in the area. The effort seems to have worked.

Robert Foraker, a private merchant banker from Canton, Ohio, and a 10-year recipient of NASA agreements, was impressed with the LITO system when he saw it at the Technology Days event, and authored a collaboration agreement with RL Associates to bring the LITO system to ground-based use at NASA Glenn. Foraker arranged the private funding required for the project from The Woodhaven Group, a progressive management company out of Augusta, Georgia, managed by Jared Sullivan, with whom Foraker had previously worked through his ACT International Incubator.

The entire LITO/NASA collaboration came together in just 51 weeks, from the 2012 Technology Days event to a Space Act Agreement between NASA’s Glenn Research Center and LITO in 2013.

This collaboration was made in an effort to get advanced technology commercialized for public use and to help streamline the way the public and private sectors work together to move technology forward. Hopefully, this will be just the first of many collaborations between NASA and the private sector to help technological entrepreneurs by showcasing their technology and moving it to market quickly.

More information on LITO’s laser systems can be found at www.LITOTechnologies.com or email directly to pr@litotechnologies.com.

[Source]

Behavior Prediction Tools Strengthen Nanoelectronics

NASA Technology

Several years ago, NASA started making plans to send robots to explore the deep, dark craters on the Moon. As part of these plans, NASA needed modeling tools to help engineer unique electronics to withstand extremely cold temperatures.

According to Jonathan Pellish, a flight systems test engineer at Goddard Space Flight Center, “An instrument sitting in a shadowed crater on one of the Moon’s poles would hover around 43 K”—that is, 43 kelvin, equivalent to -382 °F. Such frigid temperatures are one of the main factors that make the extreme space environments encountered on the Moon and elsewhere so extreme.
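
The conversion is the standard kelvin-to-Fahrenheit formula:

```python
def kelvin_to_fahrenheit(k: float) -> float:
    """F = K * 9/5 - 459.67"""
    return k * 9.0 / 5.0 - 459.67

print(kelvin_to_fahrenheit(43.0))   # -> -382.27, i.e. about -382 degrees F
```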

Radiation is another main concern. “Radiation is always present in the space environment,” says Pellish. “Small to moderate solar energetic particle events happen regularly and extreme events happen less than a handful of times throughout the 7 active years of the 11-year solar cycle.” Radiation can corrupt data, propagate to other systems, require component power cycling, and cause a host of other harmful effects.

In order to explore places like the Moon, Jupiter, Saturn, Venus, and Mars, NASA must use electronic communication devices like transmitters and receivers and data collection devices like infrared cameras that can resist the effects of extreme temperature and radiation; otherwise, the electronics would not be reliable for the duration of the mission.

Technology Transfer

Since 1987, NASA has partnered with Huntsville, Alabama-based CFD Research Corporation (CFDRC), a company that specializes in engineering simulations and innovative designs and prototypes for aerospace and other industries. A few years ago, CFDRC received funding from Marshall Space Flight Center’s Small Business Innovation Research (SBIR) program to refine an existing software tool to predict the behavior of electronics in the cold, radiation-filled environment of space.

During the first phase of its work, in collaboration with Georgia Tech, CFDRC enhanced and demonstrated a technology called NanoTCAD for predicting the response of silicon-germanium (SiGe) semiconductor technology to radiation. During its second phase, the company demonstrated and validated NanoTCAD for temperatures ranging from −382 °F to 266 °F.

Marek Turowski, the director of the nanoelectronic and plasma technology group at CFDRC, explains how, as electronic parts become smaller, the effects of radiation and temperature become more severe. “When radiation particles bombard a microchip, it is like hail hitting a car,” he says.

Even though hail may not damage a large truck, the same hail could cause significant damage to a truck the size of a toy. Likewise, as electronic devices decrease in size, radiation particles can damage them more easily.

Being able to predict the behavior of nanoelectronics in the extreme space environment reduces the risk of failure during a critical NASA mission. Using NanoTCAD, designers can better evaluate performance and response of electronics early in the design stage, thereby reducing the costs and testing time involved. As Turowski explains it, “The purpose of NanoTCAD tools and models is to predict the behavior of electronics in space before they actually go to space. The prediction happens on the computer screen and accurately takes temperature and radiation into account.”

Pellish says NanoTCAD has already been used to evaluate key technologies for the Ice, Cloud, and land Elevation Satellite-2 (ICESat-2), scheduled for launch in 2016. ICESat-2 will look at polar ice, sea-level change, vegetation canopy height, and climate. “The NanoTCAD research on SiGe semiconductor technology processes provided a portion of the necessary insight into this technology so that it can be used in space,” he says.


Benefits

NanoTCAD software is now available from CFDRC as a nanotechnology computer aided design (CAD) tool to predict the effects of extreme thermal and radiation environments on electronic systems. It is also used by CFDRC in its modeling and simulation services provided to the aerospace industry. The “nano” part of the product’s name means the software can address nano-size devices while “TCAD” stands for “technology computer aided design.”

“It solves basic physics equations,” says Turowski. “It looks at how electrons flow, how fields inside the devices behave, and how the varying temperature affects their behavior.”

Today, CFDRC’s NanoTCAD customers include electronic chip designers at Georgia Tech and Vanderbilt University. The electronics, chips, circuits, and devices that the universities are modeling with NanoTCAD are often for NASA missions. The European Space Agency and the Japanese Aerospace Exploration Agency are also potential customers of CFDRC’s NASA-improved technology.

The tool is also being employed for Department of Defense applications for space communication and surveillance systems for satellites. Entities like the Air Force and Navy design electronics that can suffer the same problems as NASA spacecraft. CFDRC also uses NanoTCAD to provide modeling, simulations, and radiation-hardening design services to national nuclear laboratories and commercial satellite designers.

According to CFDRC, the technology has led to approximately $2 million in revenue for the company, created new jobs, and led to partnerships with other defense and industrial customers.

“NASA has given us the opportunity to develop valuable technology,” says Turowski. “Now the technology is being adapted and enhanced for every new generation of electronics.”

Whether it is for the Moon, on-orbit, or other applications, CFDRC’s work with NASA is helping to make future space missions possible.

[Source]

Simulation Packages Expand Aircraft Design Options

NASA Technology

When engineers explore designs for safer, more fuel-efficient, or faster aircraft, they encounter a common problem: they never know exactly what will happen until the vehicle gets off the ground.

“You will never get the complete answer until you build the airplane and fly it,” says Colin Johnson of Desktop Aeronautics. “There are multiple levels of simulation you can do to approximate the vehicle’s performance, however.”

When designing a new air vehicle, computational fluid dynamics, or CFD, comes in very handy for engineers. CFD can predict the flow of fluids and gases around an object—such as over an aircraft’s wing—by running complex calculations of the fluid physics. This information is helpful in assessing the aircraft’s aerodynamic performance and handling characteristics.

In 2001, after several years of development, NASA released a new approach to CFD called Cart3D. The tool provides designers with an automated, highly accurate computer simulation suite to streamline the conceptual analysis of aerospace vehicles. Specifically, it allows users to perform automated CFD analysis on complex vehicle designs. In 2002, the innovation won NASA’s Software of the Year award.

Michael Aftosmis, one of the developers of Cart3D and a fluid mechanics engineer at Ames Research Center, says the main purpose of the program was to remove the mesh generation bottleneck from CFD. A major benefit of Cart3D is that the mesh, or the grid for analyzing designs, is produced automatically. Traditionally, the mesh has been generated by hand, and requires months or years to produce for complex vehicle configurations. Cart3D’s automated volume mesh generation enables even the most complex geometries to be modeled hundreds of times faster, usually within seconds. “It allows a novice user to get the same quality results as an expert,” says Aftosmis.
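
The automation Aftosmis describes can be caricatured in a few lines: start from a single coarse Cartesian cell and recursively subdivide wherever the geometry cuts through, with no hand meshing anywhere. The 2D quadtree and circle geometry below are placeholders; Cart3D’s real algorithm is three-dimensional and far more sophisticated:

```python
# Cartoon of automated Cartesian meshing: recursively split any cell that the
# body's surface crosses, down to a depth limit (geometry is a placeholder).
from math import hypot

def crosses_body(cx: float, cy: float, h: float) -> bool:
    """True if the square cell [cx-h, cx+h] x [cy-h, cy+h] straddles a unit circle."""
    near = hypot(max(abs(cx) - h, 0.0), max(abs(cy) - h, 0.0))  # closest point
    far = hypot(abs(cx) + h, abs(cy) + h)                       # farthest corner
    return near <= 1.0 <= far

def refine(cx: float, cy: float, h: float, depth: int, cells: list) -> None:
    if depth == 0 or not crosses_body(cx, cy, h):
        cells.append((cx, cy, h))            # keep as a leaf cell
        return
    q = h / 2
    for dx in (-q, q):
        for dy in (-q, q):                   # split into four children
            refine(cx + dx, cy + dy, q, depth - 1, cells)

cells: list = []
refine(0.0, 0.0, 2.0, depth=6, cells=cells)
print(len(cells), "cells; finest cell size:", min(c[2] for c in cells) * 2)
```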

Now, a decade later, NASA continues to enhance Cart3D to meet users’ needs for speed, power, and flexibility. Cart3D provides the best of both worlds—the payoff of using a complex, high-fidelity simulation with the ease of use and speed of a much simpler, lower-fidelity simulation method. Aftosmis explains how instead of simulating just one case, Cart3D’s ease of use and automation allows a user to efficiently simulate many cases to understand how a vehicle behaves for a range of conditions. “Cart3D is the first tool that was able to do that successfully,” he says.

At NASA, Aftosmis estimates that 300–400 engineers use the package. “We use it for space vehicle design, supersonic aircraft design, and subsonic aircraft design.”

Technology Transfer

To enable more use of Cart3D for private and commercial aviation entities, the Small Business Innovation Research (SBIR) program at Langley Research Center provided funds to Desktop Aeronautics, based in Palo Alto, California, to build a plug-in to Cart3D that increases the code’s accuracy under particular flow conditions. Aftosmis says Desktop Aeronautics delivered valuable results and made Cart3D more applicable for general use. “Now they are bringing the product to market. This is something we never would have had the time to do at NASA. That’s the way the SBIR process is supposed to work.”

In 2010, Desktop Aeronautics acquired a license from Ames to sell Cart3D. The company further enhanced the software by making it cross-platform, incorporating a graphical user interface, and adding specialized features to enable extra computation for the analysis of airplanes with engines and exhaust.

“I think it’s going to be game-changing for CFD,” says Aftosmis. “Cart3D is the only commercial simulation tool that can guarantee the accuracy of every solution the user does.”


Benefits

Today, Desktop Aeronautics employs Cart3D in its consulting services and licenses the spinoff product to clients for in-house use. The company provides commercial licenses and academic licenses for research and development projects.

The software package allows users to perform automated CFD analysis on complex designs and, according to the company, enables geometry acquisition and mesh generation to be performed within a few minutes on most desktop computers.

Simulations generated by Cart3D are assisting organizations in the design of subsonic aircraft, space planes, spacecraft, and high-speed commercial jets. Customers are able to simulate the efficiency of designs through performance metrics such as lift-to-drag ratio.

“It will assemble a spectrum of solutions for many different cases, and from that spectrum, the cases that perform best give insight into how to improve one’s design,” says Johnson. “Cart3D’s preeminent benefit is that it’s automated and can handle complex geometry. It’s blazing fast. You push a button, and it takes care of the volume meshing and flow measurement.”
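
In sketch form, that “spectrum of solutions” workflow is a sweep over cases followed by a ranking on a metric such as lift-to-drag ratio. Here run_case is a stand-in for an actual Cart3D run, with made-up aerodynamic numbers:

```python
# Sweep two hypothetical wing designs over angles of attack and pick the case
# with the best lift-to-drag ratio; run_case fakes what a CFD run would return.

def run_case(design: str, alpha_deg: float) -> tuple:
    """Placeholder for a CFD run returning (lift, drag) coefficients."""
    cl_slope, cd0 = {"wing-A": (0.11, 0.008), "wing-B": (0.09, 0.006)}[design]
    cl = cl_slope * alpha_deg
    return cl, cd0 + 0.05 * cl * cl          # simple parabolic drag polar

cases = [(d, a, *run_case(d, a)) for d in ("wing-A", "wing-B") for a in range(1, 11)]
design, alpha, cl, cd = max(cases, key=lambda c: c[2] / c[3])
print(f"best: {design} at alpha={alpha} deg, L/D={cl / cd:.1f}")
```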

Without building an aircraft, engineers can never be completely certain which design concept will perform best in flight. However, they now have a tool to make the most informed prediction possible.

[Source]

Web Solutions Inspire Cloud Computing Software

NASA Technology

In 2008, a NASA effort to standardize its websites inspired a breakthrough in cloud computing technology. The innovation has spurred the growth of an entire industry in open source cloud services that has already attracted millions in investment and is currently generating hundreds of millions in revenue.

William Eshagh was part of the project in the early days, when it was known as NASA.net. “The feeling was that there was a proliferation of NASA websites and approaches to building them. Everything looked different, and it was all managed differently—it was a fragmented landscape.”

NASA.net aimed to resolve this problem by providing a standard set of tools and methods for web developers. The developers, in turn, would provide design, code, and functionality for their project while adopting and incorporating NASA’s standardized approach. Says Eshagh, “The basic idea was that the web developer would write their code and upload it to the website, and the website would take care of everything else.”

Even though the project was relatively narrow in its focus, the developers soon realized that they would need bigger, more foundational tools to accomplish the job. “We were trying to create a ‘platform layer,’ which is the concept of giving your code over to the service. But in order to build that, we actually needed to go a step deeper,” says Eshagh.

That next step was to create an “infrastructure service.” Whereas NASA.net was a platform for dealing with one type of application, an infrastructure service is a more general tool with a simpler purpose: to provide access to computing power. While such computing power could be used to run a service like NASA.net, it could also be used for other applications.

Put another way, what the team came to realize was that they needed to create a cloud computing service. Cloud computing is the delivery of software, processing power, and storage over the Internet. Whether these resources are as ordinary as a library of music files or as complex as a network of supercomputers, cloud computing enables an end user to control them remotely and simply. “The idea is to be able to log on to the service and say ‘I want 10 computers,’ and within a minute, I can start using those computers for any purpose whatsoever,” says Eshagh.
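
Eshagh’s “I want 10 computers” is, almost literally, how such a request reads today. Against OpenStack, the project this effort eventually grew into (described below), it might be written with the openstacksdk client; the cloud name, image, flavor, and network here are placeholders for whatever a given deployment provides:

```python
# Provisioning "10 computers" with the openstacksdk client (names are
# placeholders; credentials come from a local clouds.yaml file).
import openstack

conn = openstack.connect(cloud="example-cloud")
image = conn.compute.find_image("ubuntu-22.04")      # placeholder image
flavor = conn.compute.find_flavor("m1.small")        # placeholder flavor
network = conn.network.find_network("private")       # placeholder network

servers = [
    conn.compute.create_server(
        name=f"worker-{i}",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    for i in range(10)                               # "I want 10 computers"
]
for server in servers:
    conn.compute.wait_for_server(server)             # usable within about a minute
```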

As the scope of the project expanded, NASA.net came to be known as Nebula. Much more than setting standards for Agency web developers, Nebula was intended to provide NASA developers, researchers, and scientists with a wide range of services for accessing and managing the large quantities of data the Agency accumulates every day. This was an enormous undertaking that only a high-powered cloud computing platform could provide.

Raymond O’Brien, former program manager of Nebula, says the project was in some ways ahead of its time. “Back in 2008 and 2009, people were still trying to figure out what ‘cloud’ meant. While lots of people were calling themselves ‘cloud enabled’ or ‘cloud ready,’ there were few real commercial offerings. With so little clarity on the issue, there was an opportunity for us to help fill that vacuum.”


As the team built Nebula, one of the most pressing questions they faced was that of open source development, or the practice of building software in full view of the public over the Internet.

On the one hand, proprietary code might have helped the project overcome early hurdles, as commercial software can offer off-the-shelf solutions that speed up development by solving common problems. Proprietary software is sometimes so useful and convenient that the Nebula team wasn’t even sure that they could create the product without relying on closed source solutions at some point.

On the other hand, open source development would facilitate a collaborative environment without borders—literally anyone with the know-how and interest could access the code and improve on it. Because Nebula had evolved into a project that was addressing very general, widespread needs—not just NASA-wide, but potentially worldwide—the possibility of avoiding restrictive licensing agreements by going open source was very attractive.

O’Brien says that broad appeal was an important part of Nebula’s identity. “From the beginning, we wanted this project to involve a very large community—private enterprises, academic institutions, research labs—that would take Nebula and bring it to the next level. It was a dream, a vision. It was that way from the start.”

Despite uncertainties, the development team decided to make Nebula purely open source. Eshagh says the real test for that philosophy came when those constraints were stretched to their limits. “Eventually, we determined that existing open source tools did not fully address Nebula’s requirements,” he says. “But instead of turning to proprietary tools, we decided to write our own.”

The problem was with a component of the software called the cloud controller, or the tool that can turn a single server or pool of servers into many virtual servers, which can then be provisioned remotely using software. In effect, the controller gives an end user access to as much or as little computing power and storage as is needed. Existing tools were either written in the wrong programming language or under the wrong software license.

Within a matter of days, the Nebula team had built a new cloud controller from scratch, in Python (their preferred programming language for the controller), and under an open source license. When the team announced this breakthrough on its blog, they immediately began attracting attention from some of the biggest players in the industry. “We believed we were addressing a general problem that would have broad interest,” says Eshagh. “As it turns out, that prediction couldn’t have been more accurate.”
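
The essential job of such a controller can be shown in miniature: accept a request, find capacity in a pool of physical hosts, and hand back a virtual server. The toy class below is purely conceptual and a tiny fraction of what the team’s controller actually did:

```python
# Toy cloud controller: carve a pool of physical hosts into virtual servers on
# request (entirely hypothetical; shown only to make the concept concrete).
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cores_free: int
    guests: list = field(default_factory=list)

class CloudController:
    def __init__(self, hosts: list):
        self.hosts = hosts

    def provision(self, name: str, cores: int) -> str:
        """Place a virtual server on the first host with enough capacity."""
        for host in self.hosts:
            if host.cores_free >= cores:
                host.cores_free -= cores
                host.guests.append(name)
                return f"{name} scheduled on {host.name}"
        raise RuntimeError("pool exhausted")

ctl = CloudController([Host("node1", 8), Host("node2", 8)])
for i in range(3):
    print(ctl.provision(f"vm-{i}", cores=4))   # fills node1, then spills to node2
```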

Technology Transfer

Rackspace Inc., of San Antonio, Texas, was one of the companies most interested in the technology. Rackspace runs the second largest public cloud in the world and was at the time offering computing and storage services using software they had created in-house. Jim Curry, general manager of Rackspace Cloud Builders, says they faced hurdles similar to those NASA faced in building a private cloud. “We tried to use available technology,” he says, “but it couldn’t scale up to meet our needs.”

The engineers at Rackspace wrote their own code for a number of years, but Curry says they didn’t see it as a sustainable activity. “We’re a hosting company—people come to us when they want to run standard server environments with a high level of hosting support that we can offer them. Writing proprietary code for unique technologies is not something we wanted to be doing long-term.”

The developers at Rackspace were fans of open source development and had been looking into open source solutions right at the time the Nebula team announced its new cloud controller. “Just weeks before we were going to announce our own open source project, we saw that what NASA had released looked very similar to what we were trying to do.” Curry reached out to the Nebula team, and within a week the two development teams met and agreed that it made sense to collaborate on the project going forward.

Each of the teams brought something to the table, says Curry. “The nice thing about it was that we were more advanced than NASA in some areas and vice versa, and we each complemented the other very well. For example, NASA was further along with their cloud controller, whereas we were further along on the storage side of things.”

The next step was for each organization to make its code open source so the two teams could launch the project as an independent, open entity. Jim Curry says the team at Rackspace was stunned by the speed at which NASA moved through the process. “Within a period of 30–45 days, NASA completed the process of getting the agreements to have this stuff done. From my perspective, they moved as fast as any company I’ve ever worked with, and it was really impressive to watch.”

The OpenStack project, the successor to Nebula with development from Rackspace, was announced in July 2010. As open source software, OpenStack has attracted a very broad community: nearly 2,500 independent developers and 150 companies are a part of it—including such giants as AT&T, HP, Cisco, Dell, and Intel. Semi-annual developers’ conferences, where members of the development community meet to exchange ideas and explore new directions for the software, now attract over 1,000 participants from about two dozen different countries.

Benefits

Because OpenStack is free, companies who use it to deploy servers do not need to pay licensing fees—fees that can easily total thousands of dollars per server per year. With the number of companies that have already adopted OpenStack, the software has potentially saved millions of dollars in server costs.

“Before OpenStack,” says Curry, “your only option was to pay someone money to solve the problem that OpenStack is addressing today. For people who want it as a solution, who like the idea of consuming open source, they now have an alternative to proprietary options.”

Not only is OpenStack saving money; it is also generating jobs and revenue at a remarkable pace. Curry says that dozens of Rackspace’s 80 cloud engineering jobs are directly attributable to OpenStack, and that the technology has created hundreds of jobs throughout the industry. “Right now, trying to find someone with OpenStack experience, especially in San Francisco, is nearly impossible, because demand is so high.”

The technology is currently generating hundreds of millions in revenue: Rackspace’s public cloud alone—which largely relies on OpenStack—currently takes in $150 million a year. Curry, Eshagh, and O’Brien all predict that the software will be its own billion-dollar industry within a few years.

Because OpenStack is open source, and is modified and improved by the people who use it, it is more likely to remain a cutting-edge solution for cloud computing needs. Says Eshagh, “We are starting to see the heavyweights in the industry adding services on top of OpenStack—which they can do because they have a common framework to build from. That means we’ll see even more services and products being created.”

In 2012, Rackspace took steps to secure OpenStack’s future as a free and open source project: the company began the process of spinning off the platform into its own nonprofit organization. By separating itself from any one commercial interest, Curry says, the project will be better positioned to continue doing what its founders hoped it would.

O’Brien maintains that OpenStack’s potential is far from being realized. “It’s hard to characterize in advance. If you had asked an expert about Linux years ago, who could have predicted that it would be in nearly everything, as it is today? It’s in phones and mobile devices. It’s in 75 percent of deployed servers. It’s even used to support space missions. OpenStack has a chance to hit something similar to that in cloud computing.”

Curry agrees: “In the future, you can envision almost all computing being done in the cloud, much of which could be powered by OpenStack. I think that NASA will need to receive significant credit for that in the history books. What we’ve been able to do is unbelievable—especially when you remember that it all started in a NASA lab.”

[Source]