Tech News Digest: September 2014

The following is a roundup of technology-related news and notable developments, with a focus on electrical engineering, computing, information technology and allied fields, reported during August 2014. Items are excerpted from news releases generated by universities, government agencies and other research institutions.

Generating Power from the Meeting of River and Sea Water

Where the river meets the sea, there is the potential to harness a significant amount of renewable energy, according to an MIT research team. The researchers evaluated an emerging method of power generation called pressure retarded osmosis (PRO), in which two streams of different salinity are mixed to produce energy. In principle, a PRO system would take in river water and seawater on either side of a semi-permeable membrane. Through osmosis, water from the less-salty stream would cross the membrane to a pre-pressurized saltier side, creating a flow that can be sent through a turbine to recover power. The MIT team has now developed a model to evaluate the performance and optimal dimensions of large PRO systems. In general, the researchers found that the larger a system’s membrane, the more power can be produced, but only up to a point. Interestingly, 95 percent of a system’s maximum power output can be generated using only half or less of the maximum membrane area.
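To make the diminishing-returns result concrete, here is a minimal one-dimensional PRO sketch in Python. Every parameter (membrane permeability, pressures, flow) is an illustrative assumption, not a value from the MIT model: as permeate dilutes the pressurized seawater stream, the osmotic driving force falls, so each additional square meter of membrane buys less power.

```python
# Minimal 1-D sketch of PRO power vs. membrane area (illustrative numbers only).
import numpy as np

A_w  = 1e-12   # membrane water permeability, m/(s*Pa) -- assumed
dP   = 12e5    # hydraulic pressure on the seawater (draw) side, Pa -- assumed
pi_0 = 24e5    # initial osmotic pressure difference, Pa -- roughly sea vs. river
Q_0  = 1e-3    # inlet seawater flow, m^3/s -- assumed

def pro_power(area_m2, n_segments=1000):
    """March along the membrane: permeate dilutes the draw stream, so the
    osmotic driving force (and the local water flux) falls with distance."""
    dA = area_m2 / n_segments
    Q, power = Q_0, 0.0
    for _ in range(n_segments):
        d_pi = pi_0 * Q_0 / Q            # dilution lowers osmotic pressure
        J = A_w * max(d_pi - dP, 0.0)    # local permeate flux, m/s
        dQ = J * dA                      # water gained in this segment
        Q += dQ
        power += dP * dQ                 # power = pressure * volumetric flow
    return power

areas = np.linspace(100, 1e4, 50)
watts = [pro_power(a) for a in areas]
w_max = max(watts)
w_half = pro_power(areas[-1] / 2)
print(f"max ~{w_max:.0f} W; half the membrane area still gives {100 * w_half / w_max:.0f}%")
```

Even with these made-up numbers, the sweep reproduces the qualitative finding quoted above: well over 95 percent of the maximum output is available from half the membrane area.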

For more information, see: http://www.eurekalert.org/pub_releases/2014-08/miot-tpo082014.php

Privacy Engineering Workshop Set for September

The National Institute of Standards and Technology (NIST) will hold its second Privacy Engineering Workshop in San Jose, California, on 15-16 September 2014. The event is co-sponsored by the International Association of Privacy Professionals (IAPP) and is part of NIST’s efforts to address the lack of well-developed models, technical standards and best practices in privacy risk management. The workshop will focus on a set of draft privacy engineering objectives and a risk model that NIST developed using input from its first workshop on the subject, held in April 2014. That initial meeting attracted participants from a broad array of companies, advocacy groups, associations, government agencies and universities, and explored the idea that addressing privacy issues requires a framework for analysis analogous to those used in other fields.

For more information, see: http://www.nist.gov/itl/privacy-081214.cfm

Collection of Automobile Speed Data Could Compromise Privacy

Some drivers are jumping at the chance to let auto insurance companies monitor their driving habits in return for a handsome discount on their premiums. What these drivers may not know is that they could also be revealing where they drive, crossing a privacy boundary that many would not knowingly accept. A team of Rutgers University computer engineers has shown that, even without a GPS device or other location-sensing technology, a driver’s route can be reconstructed from no more information than a starting location and a steady stream of data showing how fast the person was driving. The technique, dubbed "elastic pathing," predicts pathways by matching speed patterns against street layouts.
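A toy sketch of the idea in Python (the routes, speed profiles and scoring below are invented for illustration; the Rutgers algorithm is considerably more sophisticated): candidate routes from the known starting point each predict a characteristic speed profile from map features such as turns and stop signs, and the observed speed trace is scored against each prediction.

```python
# Hypothetical illustration of path inference from speed data alone.
import numpy as np

# Candidate routes from a known start. Each predicts a per-second speed
# profile (m/s): straights ~13, turns force ~4, stop signs force ~0.
candidate_routes = {
    "Main St -> 1st Ave":  np.array([13, 13, 13, 4, 13, 13, 0, 13, 13, 13]),
    "Main St -> 2nd Ave":  np.array([13, 13, 0, 13, 13, 13, 13, 4, 13, 13]),
    "River Rd (no stops)": np.array([13, 13, 13, 13, 13, 13, 13, 13, 13, 13]),
}

# The kind of speed stream an insurance dongle reports.
observed = np.array([12, 13, 14, 5, 12, 13, 1, 12, 13, 14])

def route_score(predicted, observed):
    """Lower is better: mean squared error between predicted and observed speeds."""
    return float(np.mean((predicted - observed) ** 2))

best = min(candidate_routes, key=lambda r: route_score(candidate_routes[r], observed))
print("Most likely route:", best)   # -> Main St -> 1st Ave
```

The real technique is "elastic" in that it can stretch and compress the trace in time before matching, closer in spirit to dynamic time warping than to the rigid per-second comparison used here.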

For more information, see: http://www.eurekalert.org/pub_releases/2014-08/ru-rrs081114.php

SyNAPSE Program Develops Advanced Brain-Inspired Chip

DARPA-funded IBM researchers have developed one of the world’s largest and most complex computer chips ever produced, one whose architecture is inspired by the neuronal structure of the brain and requires only a fraction of the electrical power of conventional chips. Funded under DARPA’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program, the chip is loaded with more than 5 billion transistors and boasts more than 250 million "synapses," or programmable logic points, analogous to the connections between neurons in the brain. That’s still orders of magnitude fewer than the number of actual synapses in the brain, but a giant step toward making ultra-high performance, low-power neuro-inspired systems a reality.
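As a rough illustration of the style of computation such neuromorphic hardware performs, here is a generic leaky integrate-and-fire neuron in Python. This is a textbook spiking-neuron model with made-up parameters, not IBM’s actual circuit design.

```python
# Generic leaky integrate-and-fire (LIF) neuron -- illustrative only.
import numpy as np

def simulate_lif(input_current, v_thresh=1.0, v_reset=0.0, leak=0.95):
    """Each tick: leak stored charge, integrate synaptic input,
    emit a spike and reset when the potential crosses threshold."""
    v, spike_times = 0.0, []
    for t, i_in in enumerate(input_current):
        v = leak * v + i_in       # leaky integration of incoming current
        if v >= v_thresh:         # membrane potential reaches threshold
            spike_times.append(t)
            v = v_reset           # reset after the spike
    return spike_times

rng = np.random.default_rng(0)
drive = rng.uniform(0.0, 0.3, size=100)   # noisy synaptic drive
print("spike times:", simulate_lif(drive))
```

The broad intuition behind the chip’s efficiency is that many such neurons and their programmable connections operate in parallel and are event-driven, doing significant work only when spikes occur rather than on every clock cycle.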

For more information, see: http://www.darpa.mil/NewsEvents/Releases/2014/08/07.aspx

Atoms to Product: Aiming to Make Nanoscale Benefits Life-sized

DARPA’s new Atoms to Product (A2P) program seeks to develop enhanced technologies for assembling atomic-scale pieces and integrating these components into materials and systems in ways that preserve and exploit distinctive nanoscale properties.   If available in human-scale products and systems, nanoscale materials could offer potentially revolutionary defense and commercial capabilities.

"We want to explore new ways of putting incredibly tiny things together, with the goal of developing new miniaturization and assembly methods that would work at scales 100,000 times smaller than current state-of-the-art technology," said John Main, DARPA program manager. "If successful, A2P could help enable creation of entirely new classes of materials that exhibit nanoscale properties at all scales. It could lead to the ability to miniaturize materials, processes and devices that can’t be miniaturized with current technology, as well as build three-dimensional products and systems at much smaller sizes."

For more information, see: http://www.darpa.mil/NewsEvents/Releases/2014/08/22.aspx

Research Paves Way for Development of Cyborg Moth ‘Biobots’

Researchers at North Carolina State University have developed methods for electronically manipulating the flight muscles of moths and for monitoring the electrical signals moths use to control those muscles. The work opens the door to the development of remotely controlled moths, or ‘biobots,’ for use in emergency response.

For more information, see: http://news.ncsu.edu/releases/bozkurt-moth-jove-2014/

‘Robo Brain’ Taps Internet to Teach Robots

Robo Brain, a large-scale computational system that learns from publicly available Internet resources, is currently downloading and processing about 1 billion images, 120,000 YouTube videos, and 100 million how-to documents and appliance manuals. The information is being translated and stored in a robot-friendly format that robots will be able to draw on when they need it. The system employs what computer scientists call "structured deep learning," where information is stored in many levels of abstraction. An easy chair is a member of the class of chairs, and going up another level, chairs are furniture. Robo Brain knows that chairs are something you can sit on, but that a human can also sit on a stool, a bench or the lawn.
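A toy sketch of that layered representation: a class hierarchy plus affordances, so a query can walk up the abstraction levels. The structure below is invented for illustration; Robo Brain itself learns such knowledge at scale with structured deep learning.

```python
# Hypothetical knowledge store: abstraction hierarchy plus affordances.
is_a = {                      # child -> parent abstraction level
    "easy chair": "chair",
    "chair": "furniture",
    "bench": "furniture",
}
affordances = {               # object -> actions associated with it
    "chair": {"sit on"},
    "bench": {"sit on"},
    "stool": {"sit on"},
    "lawn":  {"sit on", "walk on"},
}

def ancestors(obj):
    """Walk up the abstraction hierarchy from a concrete object."""
    chain = []
    while obj in is_a:
        obj = is_a[obj]
        chain.append(obj)
    return chain

def can(obj, action):
    """Check the object and everything it abstracts into for an affordance."""
    return any(action in affordances.get(o, set()) for o in [obj] + ancestors(obj))

print(ancestors("easy chair"))       # ['chair', 'furniture']
print(can("easy chair", "sit on"))   # True -- inherited from 'chair'
print(can("lawn", "sit on"))         # True -- even though a lawn is not furniture
```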

For more information, see: http://www.eurekalert.org/pub_releases/2014-08/cu-bw082214.php

Global City Teams Challenge Seeks to Create Smart Cities

On 5 August, the National Institute of Standards and Technology (NIST) and several partners kicked off the year-long Global City Teams Challenge to help communities around the world work together to address issues ranging from air quality to traffic management to emergency services coordination. NIST is inviting communities and innovators to create teams that will foster the spread of "smart cities" that take advantage of networked technologies to better manage resources and improve quality of life. The challenge officially begins with a two-day workshop 29-30 September 2014 at NIST’s Gaithersburg, Md., campus that will bring together city planners and representatives from technology companies, academic institutions and nonprofits. The challenge is open to participants around the world, and international representatives will be able to take part in the kick-off meeting via webcast.

For more information, see: http://www.nist.gov/cps/cps-080514.cfm

No-Power Wi-Fi Connectivity Could Enable Internet of Things

The not-so-distant "Internet of Things" would extend connectivity to perhaps billions of devices. Sensors could be embedded in everyday objects to help monitor and track everything from the structural safety of bridges to the health of your heart. But the lack of a cheap way to power these devices and connect them to the Internet has kept the concept from taking off. Now, University of Washington engineers have designed a new communication system that uses radio frequency signals as a power source and reuses existing Wi-Fi infrastructure to provide Internet connectivity to these devices. Called Wi-Fi backscatter, this technology is the first that can connect battery-free devices to Wi-Fi infrastructure.
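A rough sketch of the decoding idea: a battery-free tag toggles whether it reflects ambient Wi-Fi packets, and a receiver reads the tag’s bits as small per-packet shifts in received signal strength. The signal levels, noise and threshold below are invented for illustration; they are not measurements from the UW prototype.

```python
# Hypothetical Wi-Fi backscatter decoding: bits as small RSSI shifts.
import numpy as np

rng = np.random.default_rng(1)
tag_bits = [1, 0, 1, 1, 0, 0, 1, 0]

# Simulate per-packet RSSI: a reflecting tag (bit = 1) adds ~1 dB on top of
# a -60 dBm baseline with ~0.15 dB measurement noise (all numbers assumed).
baseline = -60.0
rssi = np.array([baseline + 1.0 * b + rng.normal(0, 0.15) for b in tag_bits])

# Decode by thresholding each packet against the midpoint of the trace.
threshold = (rssi.max() + rssi.min()) / 2
decoded = [int(r > threshold) for r in rssi]

print("sent:   ", tag_bits)
print("decoded:", decoded)
```

The design point worth noting is that the tag never transmits a radio signal of its own; it only modulates reflections of traffic already in the air, which is why it can run without a battery.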

For more information, see: http://www.eurekalert.org/pub_releases/2014-08/uow-nwc080414.php

Updated NIST Guide Provides Computer Security Assessment Procedures for Core Security Controls

The National Institute of Standards and Technology (NIST) has issued for public comment a draft update of its primary guide to assessing the security and privacy controls that safeguard federal information systems and networks. Public comments are due by 26 September 2014.  NIST publishes two complementary publications that together provide its basic guidance and recommendations for ensuring data security and privacy protection in federal information systems and organizations, a role assigned to NIST under the Federal Information Security Management Act (FISMA).

For more information, see: http://www.nist.gov/itl/csd/800-53a-080114.cfm

Next-Generation Code Optimization Effort To Prepare Users for Transition to Exascale Computing

With the promise of exascale supercomputers looming on the horizon, much of the roadmap is dotted with questions about hardware design and how to make these systems energy efficient enough that centers can afford to run them. Often taking a back seat is an equally important question: will scientists be able to adapt their applications to take advantage of exascale once it arrives? The Department of Energy’s (DOE) National Energy Research Scientific Computing Center (NERSC), located at Lawrence Berkeley National Laboratory, is working to address this gap with the NERSC Exascale Science Applications Program (NESAP), an application readiness effort launched to support NERSC’s next-generation supercomputer, Cori. NESAP, which will include partnerships with 20 application code teams and technical support from NERSC, Cray and Intel, was created to make the transition run smoothly.

For more information, see: http://www.nersc.gov/news-publications/news/nersc-center-news/2014/nersc-launches-next-generation-code-optimization-effort/

Chameleon: Cloud Computing Testbed for Computer Science

The National Science Foundation (NSF) has announced a new $10 million project to create Chameleon, an experimental testbed for cloud architecture and applications. The testbed will enable the academic research community to develop and experiment with novel cloud architectures and pursue new, architecturally enabled applications of cloud computing, specifically for the computer science domain. Chameleon is designed to support a variety of cloud-related research. To support users building cloud services and platforms, Chameleon will include persistent infrastructure clouds. To support researchers investigating low-level software for clouds, Chameleon will provide "bare metal" provisioning of hardware, where users specify, and can modify, the full software stack they will experiment on. For researchers who want dedicated, but not fully custom, environments, Chameleon will provide pre-configured software stacks provisioned on bare metal.

For more information, see: https://www.tacc.utexas.edu/news/press-releases/2014/chameleon

Laser Device May End Pin Pricks for Diabetics

Princeton University researchers have developed a way to use a laser to measure people’s blood sugar. With more work to shrink the laser system to a portable size, the technique could allow diabetics to check their condition without pricking themselves to draw blood. The laser passes through the skin cells without causing damage and is partially absorbed by the sugar molecules in the patient’s body. The researchers use the amount of absorption to measure the level of blood sugar.
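The underlying principle is essentially the Beer-Lambert law: absorbance scales with the concentration of the absorbing molecule along the optical path. Here is a minimal sketch with invented numbers; the actual calibration for glucose in living tissue is far more involved.

```python
# Beer-Lambert sketch: concentration from measured absorption (toy numbers).
import math

def concentration_from_absorbance(i_incident, i_transmitted, epsilon, path_cm):
    """Beer-Lambert: A = log10(I0 / I) = epsilon * l * c, so c = A / (epsilon * l)."""
    absorbance = math.log10(i_incident / i_transmitted)
    return absorbance / (epsilon * path_cm)

# Hypothetical values: molar absorptivity in L/(mol*cm), 0.1 cm optical path.
epsilon_glucose = 0.2   # assumed for illustration only
c = concentration_from_absorbance(i_incident=1.0, i_transmitted=0.9995,
                                  epsilon=epsilon_glucose, path_cm=0.1)
print(f"estimated glucose concentration: {c * 1000:.1f} mmol/L")
```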

For more information, see: http://www.princeton.edu/engineering/news/archive/?id=13390

Photon Speedway Puts Big Data in the Fast Lane 

A series of experiments conducted by Lawrence Berkeley National Laboratory (Berkeley Lab) and SLAC National Accelerator Laboratory (SLAC) researchers is shedding new light on the photosynthetic process. The work also illustrates how light sources and supercomputing facilities can be linked via a "photon science speedway" as a solution to emerging challenges in massive data analysis.  Last year, Berkeley Lab and SLAC researchers led a protein crystallography experiment at SLAC’s Linac Coherent Light Source (LCLS) to look at the different photoexcited states of photosystem II, an assembly of large protein molecules that play a crucial role in photosynthesis. Subsequent analysis of the data on supercomputers at the Department of Energy’s (DOE’s) National Energy Research Scientific Computing Center (NERSC) helped explain how nature splits a water molecule during photosynthesis, a finding that could advance the development of artificial photosynthesis for clean, green and renewable energy.

For more information, see: http://www.eurekalert.org/pub_releases/2014-08/dbnl-psp082514.php

Do We Live In a 2-D Hologram?

A unique experiment at the U.S. Department of Energy’s Fermi National Accelerator Laboratory called the Holometer has started collecting data that will answer some mind-bending questions about our universe, including whether we live in a hologram.

Much like characters on a television show would not know that their seemingly 3-D world exists only on a 2-D screen, we could be clueless that our 3-D space is just an illusion. The information about everything in our universe could actually be encoded in tiny packets in two dimensions. Get close enough to your TV screen and you’ll see pixels, small points of data that make a seamless image if you stand back. Scientists think that the universe’s information may be contained in the same way and that the natural "pixel size" of space is roughly 10 trillion trillion times smaller than an atom, a distance that physicists refer to as the Planck scale.
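That scale comparison is easy to check with standard rounded values for the Planck length and a typical atomic radius:

```python
# Back-of-the-envelope check of the "10 trillion trillion" figure.
planck_length = 1.6e-35   # meters (standard rounded value)
atomic_radius = 1.0e-10   # meters, order of magnitude for an atom

ratio = atomic_radius / planck_length
print(f"an atom spans ~{ratio:.1e} Planck lengths")   # ~6e24, i.e. roughly 1e25
```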

"We want to find out whether space-time is a quantum system just like matter is," said Craig Hogan, director of Fermilab’s Center for Particle Astrophysics and the developer of the holographic noise theory. "If we see something, it will completely change ideas about space we’ve used for thousands of years."

For more information, see: http://www.fnal.gov/pub/presspass/press_releases/2014/2-D-Hologram-20140826.html
