
Messages - Mst. Eshita Khatun

In the quest to shrink data storage down into tinier and tinier forms, scientists have scored a very, very small triumph.

They did it by creating what’s essentially an incredibly diminutive magnet: It’s just one atom in size, and while it’s not going to be holding birthday cards up on your refrigerator anytime soon, it can do something else: store a data point.

Described in the journal Nature, the experiment involved atoms of a rare earth element called holmium. Physicists working at an IBM research facility in California found that when the holmium atoms were placed on a special surface made of magnesium oxide, they naturally oriented themselves with a magnetic north and south pole—just like regular magnets have—pointing either straight up or down, and remained that way in a stable condition. What's more, they could make the atoms flip by giving them a zap with a scanning tunneling microscope that has a needle with a tip just one atom wide.

The experiment mimics the way the magnetic disk of a hard drive works. Tiny magnets on those disks point either up or down, and that orientation conveys binary information: either a one or a zero. The bits of information on hard drives are physically much bigger, though: they're made up of about 100,000 to a million atoms. The bits in the IBM experiment are minuscule.

Fabian Natterer, a scientist at the Swiss Federal Institute of Technology in Lausanne and the first author of the study, says the experiment shows that they could store one bit of information in just one atom.

You can picture an atom as looking like a tiny butterfly. The atom itself is the insect's body, while its magnetic field forms a pair of wings. The atom’s position—with the north pole of its magnetic field either up or down—represents the information, either a zero or one.
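The up-or-down orientation maps directly onto binary storage. A toy model (purely illustrative; it does not reflect IBM's actual read/write electronics) makes the encoding concrete:

```python
# Illustrative model: each atom stores one bit as the orientation
# of its magnetic pole ("up" = 1, "down" = 0).

class AtomBit:
    def __init__(self, up=False):
        self.up = up  # is the north pole pointing up?

    def flip(self):
        # In the experiment, a pulse from the STM tip flips the atom.
        self.up = not self.up

    @property
    def bit(self):
        return 1 if self.up else 0

def read_word(atoms):
    """Read a row of single-atom magnets as an integer."""
    value = 0
    for atom in atoms:
        value = (value << 1) | atom.bit
    return value

atoms = [AtomBit(), AtomBit(up=True), AtomBit(up=True)]  # 0b011
atoms[0].flip()                                          # now 0b111
print(read_word(atoms))  # 7
```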


Stratospheric aerosol geoengineering is the idea that adding a layer of aerosol particles to the upper atmosphere can reduce climate changes caused by greenhouse gases such as carbon dioxide.

Previous research shows that solar geoengineering could be achieved using commercially available aircraft technologies to deliver the particles at a cost of a few billion dollars per year and would reduce global average temperatures. However, the question remains whether this approach could reduce important climate hazards at a regional level. That is, could it reduce region-by-region changes in water availability or extreme temperatures?

Results from a new study by UCL and Harvard researchers suggest that even a crude method like injecting sulfur dioxide in the stratosphere could reduce many important climate hazards without making any region obviously worse off.

The study, published today in Environmental Research Letters, used results from a sophisticated simulation of stratospheric aerosol geoengineering to evaluate whether the approach could offset or worsen the effects of climate change around the world. The researchers also tested how these effects differed under different temperature scenarios.

The team found that halving warming by adding aerosols to the stratosphere could moderate important climate hazards in almost all regions. They saw an exacerbation of the effects of climate change in only a very small fraction of land areas.

Lead author, Professor Peter Irvine (UCL Earth Sciences), said: “Most studies focus on a scenario where solar geoengineering offsets all future warming. While this reduces overall climate change substantially, we show that in these simulations it goes too far in some respects, leading to about 9% of the land area experiencing greater climate change, i.e. seeing the effects of climate change exacerbated.

“However, if instead only half the warming is offset, then we find that stratospheric aerosol geoengineering could still reduce climate change overall but would only exacerbate change over 1.3% of the land area.”

The team emphasise that solar geoengineering only treats the symptoms of climate change and not the underlying cause, which is the build-up of CO2 and other greenhouse gases in the atmosphere. It should therefore be considered as a complementary approach to emissions cuts as a way to address climate change.

The study is a follow-up to a paper published last year in Nature Climate Change that showed similar results when solar geoengineering was approximated by simply turning down the sun. That prior study raised the question: would the results hold up in a more realistic simulation using injection of sulfur dioxide, the simplest known method of solar geoengineering?

“Our results suggest that when used at the right dose and alongside reductions in greenhouse gas emissions, stratospheric aerosol geoengineering could be useful for managing the impacts of climate change. However, there are still many uncertainties about the potential effects of stratospheric aerosol geoengineering and more research is needed to know if this idea is truly viable,” added Dr. Irvine.

The team used data from the Geoengineering Large Ensemble Study, which used a sophisticated climate-chemistry model to simulate the climate response to a hypothetical deployment of stratospheric aerosol geoengineering. In this model study, sulfur dioxide was released at different latitudes in the Tropics to produce a layer of aerosols tuned to keep temperatures steady under an extreme global warming scenario.

The researchers focused on changes in mean and extreme temperature, changes in water availability and changes in extreme precipitation, i.e. climate variables that determine key climate risks.

Previous work suggested that stratospheric aerosol geoengineering could lead to a substantial weakening of monsoons and an intensification of drought. However, the authors found that in those regions where halving warming with stratospheric aerosol geoengineering exacerbated change, it increased water availability rather than reduced it. This suggests that concerns that stratospheric aerosol geoengineering could lead to aridification and drought could be misplaced.

Co-author, Professor David Keith (Harvard’s Engineering and Applied Sciences and Kennedy school), said: “Early research with climate models consistently shows that spatially uniform solar radiation modification could significantly reduce climate risks when combined with emissions cuts. But, should we trust the models? Uncertainties are deep and no single result is trustworthy, but this paper is a step towards more realistic modeling from injection to regional impacts.”

The team are now researching the projected effects of stratospheric aerosol geoengineering on the water cycle in more depth to try to understand the potential benefits and risks to society and ecosystems.


That future may not be now, but it's one step closer, thanks to a Texas A&M University-led team of scientists and engineers and their recent discovery of a materials-based mimic for the neural signals responsible for transmitting information within the human brain.

The multidisciplinary team, led by Texas A&M chemist Sarbajit Banerjee in collaboration with Texas A&M electrical and computer engineer R. Stanley Williams and additional colleagues across North America and abroad, has discovered a neuron-like electrical switching mechanism in the solid-state material β'-CuxV2O5 -- specifically, how it reversibly morphs between conducting and insulating behavior on command.

The team was able to clarify the underlying mechanism driving this behavior by taking a new look at β'-CuxV2O5, a remarkable chameleon-like material that changes with temperature or an applied electrical stimulus. In the process, they zeroed in on how copper ions move around inside the material and how this subtle dance in turn sloshes electrons around to transform it. Their research revealed that the movement of copper ions is the linchpin of an electrical conductivity change which can be leveraged to create electrical spikes in the same way that neurons function in the cerebral nervous system -- a major step toward developing circuitry that functions like the human brain.

Their resulting paper, which features Texas A&M chemistry graduate students Abhishek Parija (now at Intel Corporation), Justin Andrews and Joseph Handy as first authors, is published Feb. 27 in the Cell Press journal Matter.

In their quest to develop new modes of energy efficient computing, the broad-based group of collaborators is capitalizing on materials with tunable electronic instabilities to achieve what's known as neuromorphic computing, or computing designed to replicate the brain's unique capabilities and unmatched efficiencies.

"Nature has given us materials with the appropriate types of behavior to mimic the information processing that occurs in a brain, but the ones characterized to date have had various limitations," Williams said. "The importance of this work is to show that chemists can rationally design and create electrically active materials with significantly improved neuromorphic properties. As we understand more, our materials will improve significantly, thus providing a new path to the continual technological advancement of our computing abilities."

While smart phones and laptops seemingly get sleeker and faster with each iteration, Parija notes that new materials and computing paradigms freed from conventional restrictions are required to meet continuing speed and energy-efficiency demands that are straining the capabilities of silicon computer chips, which are reaching their fundamental limits in terms of energy efficiency. Neuromorphic computing is one such approach, and manipulation of switching behavior in new materials is one way to achieve it.

"The central premise -- and by extension the central promise -- of neuromorphic computing is that we still have not found a way to perform computations in a way that is as efficient as the way that neurons and synapses function in the human brain," said Andrews, a NASA Space Technology Research Fellow. "Most materials are insulating (not conductive), metallic (conductive) or somewhere in the middle. Some materials, however, can transform between the two states: insulating (off) and conductive (on) almost on command."

By using an extensive combination of computational and experimental techniques, Handy said the team was able to demonstrate not only that this material undergoes a transition driven by changes in temperature, voltage and electric field strength that can be used to create neuron-like circuitry but also comprehensively explain how this transition happens. Unlike other materials that have a metal-insulator transition (MIT), this material relies on the movement of copper ions within a rigid lattice of vanadium and oxygen.

"We essentially show that a very small movement of copper ions within the structure brings about a massive change in conductance in the whole material," Handy added. "Because of this movement of copper ions, the material transforms from insulating to conducting in response to external changes in temperature, applied voltage or applied current. In other words, applying a small electrical pulse allows us to transform the material and save information inside it as it works in a circuit, much like how neurons function in the brain."
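The charge-then-fire behavior described here resembles a classic relaxation oscillator: a capacitor charges through a resistor until the device's switching voltage is reached, the material snaps to its conducting state and drains the capacitor, then resets. A rough numerical sketch of that idea follows; every component value is invented for illustration and is not a measured property of β'-CuxV2O5:

```python
# Hypothetical relaxation-oscillator sketch built around a two-state
# threshold switch (insulating vs. conducting). All parameter values
# are illustrative, not experimental data.

def simulate_spikes(v_supply=1.0, r_load=1e5, c=1e-9,
                    v_on=0.7, v_off=0.2, r_ins=1e8, r_met=1e3,
                    dt=1e-7, steps=20000):
    v, conducting, spikes = 0.0, False, 0
    for _ in range(steps):
        r_dev = r_met if conducting else r_ins
        # Capacitor charges through r_load and leaks through the device.
        v += ((v_supply - v) / r_load - v / r_dev) * dt / c
        if not conducting and v >= v_on:
            conducting = True      # insulator -> metal: fire a spike
            spikes += 1
        elif conducting and v <= v_off:
            conducting = False     # metal -> insulator: reset
    return spikes

print(simulate_spikes() > 1)  # True: the circuit fires repeatedly
```

With these toy values the voltage ramps up, fires, and resets over and over, which is the neuron-like spiking the quote describes.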

Andrews likens the relationship between the copper-ion movement and electrons on the vanadium structure to a dance.

"When the copper ions move, electrons on the vanadium lattice move in concert, mirroring the movement of the copper ions," Andrews said. "In this way, incredibly small movements of the copper ions induce large electronic changes in the vanadium lattice without any observable changes in vanadium-vanadium bonding. It's like the vanadium atoms 'see' what the copper is doing and respond."

Transmitting, storing and processing data currently accounts for about 10 percent of global energy use, but Banerjee says extrapolations indicate the demand for computation will be many times higher than the projected global energy supply can deliver by 2040. Exponential increases in computing capabilities therefore are required for transformative visions, including the Internet of Things, autonomous transportation, disaster-resilient infrastructure, personalized medicine and other societal grand challenges that otherwise will be throttled by the inability of current computing technologies to handle the magnitude and complexity of human- and machine-generated data. He says one way to break out of the limitations of conventional computing technology is to take a cue from nature -- specifically, the neural circuitry of the human brain, which vastly surpasses conventional computer architectures in terms of energy efficiency and also offers new approaches for machine learning and advanced neural networks.

"To emulate the essential elements of neuronal function in artificial circuitry, we need solid-state materials that exhibit electronic instabilities, which, like neurons, can store information in their internal state and in the timing of electronic events," Banerjee said. "Our new work explores the fundamental mechanisms and electronic behavior of a material that exhibits such instabilities. By thoroughly characterizing this material, we have also provided information that will instruct the future design of neuromorphic materials, which may offer a way to change the nature of machine computation from simple arithmetic to brain-like intelligence while dramatically increasing both the throughput and energy efficiency of processors."

Because the various components that handle logic operations, store memory and transfer data are all separate from each other in conventional computer architecture, Banerjee says they are plagued by inherent inefficiencies regarding both the time it takes for information to be processed and how physically close together device elements can be before thermal waste and electrons "accidentally" tunneling between components become major problems. By contrast, in the human brain, logic, memory storage and data transfer are simultaneously integrated into the timed firing of neurons that are densely interconnected in 3-D fanned-out networks. As a result, the brain's neurons process information at 10 times lower voltage and almost 5,000 times lower synaptic operation energy than silicon computing architectures. To come close to achieving this kind of energetic and computational efficiency, he says new materials are needed that can undergo rapid internal electronic switching in circuits in a way that mimics how neurons fire in timed sequences.

Handy notes that the team still needs to optimize many parameters, such as transition temperature and switching speed along with the magnitude of the change in electrical resistance. By determining the underlying principles of the MIT in β'-CuxV2O5 as a prototype material within an expansive field of candidates, however, the team has identified certain design motifs and tunable chemical parameters that may ultimately prove useful in the design of future neuromorphic computing materials, a major endeavor that has been seeded by the Texas A&M X-Grant Program.

"This discovery is very exciting because it provides fertile ground for the development of new design principles for tuning materials properties and also suggests exciting new approaches to researchers in the field for thinking about energy efficient electronic instabilities," Parija said. "Devices that incorporate neuromorphic computing promise improved energy efficiency that silicon-based computing has yet to deliver, as well as performance improvements in computing challenges like pattern recognition -- tasks that the human brain is especially well-equipped to tackle. The materials and mechanisms we describe in this work bring us one step closer to realizing neuromorphic computing and in turn actualizing all of the societal benefits and overall promise that comes with it."

The multi-year project incorporates team members from four disciplines (chemistry, physics, materials science and engineering, and electrical and computer engineering) and researchers from Texas A&M, Lawrence Berkeley National Laboratory, the University at Buffalo, Binghamton University and Texas A&M University at Qatar while also relying on work performed at Berkeley Lab's The Molecular Foundry and the Advanced Light Source (ALS), the Advanced Photon Source (APS) at Argonne National Laboratory and the Canadian Light Source. The research was funded primarily by the National Science Foundation (Grant No. DMR 1809866) with additional support from a Texas A&M X-Grant and the Qatar National Research Fund.


ICT / Re: Some Smartphone Hacks
« on: March 20, 2020, 10:38:08 AM »
Helpful  :)

ICT / Re: Some Web Browsing Tricks
« on: March 15, 2020, 09:52:32 PM »
Helpful. Thanks :)

NASA has selected the first two scientific investigations to fly aboard the Gateway, an orbital outpost that will support Artemis lunar operations while demonstrating the technologies necessary to conduct a historic human mission to Mars. The instruments selected for Gateway will observe space weather and monitor the Sun’s radiation environment.

“Building the Gateway with our commercial and international partners is a critical component of sustainable lunar exploration and the Artemis program,” said NASA Administrator Jim Bridenstine. “Using the Gateway as a platform for robotic and human exploration around the Moon will help inform what we do on the lunar surface as well as prepare us for our next giant leap – human exploration of Mars.”

The radiation instrument package, built by ESA (European Space Agency), will help provide an understanding of how to keep astronauts safe by monitoring the radiation exposure in Gateway’s unique orbit.

The space weather instrument suite, built by NASA, will observe solar particles and the solar wind. As we move deeper into space, human and robotic explorers face greater challenges from the sometimes violent and unpredictable outbursts of the Sun. The space weather instrument suite will gather data and enhance our ability to forecast events originating from the Sun that could affect our astronauts on and around the Moon as well as on future missions to Mars.

“Our Sun and the environment around it is very dynamic.  This instrument suite will help us observe the particles and energy that our star emits — and mitigate the risks to astronauts at the Moon and eventually Mars,” said Thomas Zurbuchen, NASA’s associate administrator for science at the agency’s headquarters in Washington. “Not only will we learn more about our space environment, but we’ll also learn how to improve forecasting space weather wherever the Artemis Generation journeys away from Earth.”

Additional scientific payloads will be selected to fly aboard the Gateway in the future.  These investigations will take advantage of the unique environment in lunar orbit, one that cannot be duplicated on Earth or on the International Space Station.

The Gateway will orbit near the Moon and will be occupied periodically by astronauts as part of NASA’s sustainable lunar exploration plans. NASA awarded Maxar Technologies a contract in May 2019 to develop the power and propulsion element which will provide solar arrays and maneuvering capabilities. NASA is continuing negotiations with Northrop Grumman to build the habitation and logistics outpost or HALO, the first pressurized module for crew visiting the Gateway.

ESA, the Japan Aerospace Exploration Agency, and the Canadian Space Agency are all actively engaged in discussions with NASA to support the construction and operation of the Gateway, which will support lunar surface missions and pave the way for the human exploration of Mars.

“This is an incredible moment in human spaceflight, as NASA is closer than at any time since the Apollo program to returning to the lunar surface,” said Bridenstine. “America is leading a return to the Moon, and this time, we’re taking all of humanity with us to explore long-term and get ready for Mars.”


In the zero-carbon cities of the future, commuting to work may take the form of hailing a driverless shuttle through an app which ferries you from your door to the nearest public transport terminal. In fact, autonomous shuttles have been in development in restricted areas for the past few years. So what will it take to make them part of our daily commute?

Jutting out into the sea, the industrial port area of Nordhavn in Denmark’s capital, Copenhagen, is currently being transformed into a futuristic waterfront city district made up of small islets. It’s billed as Scandinavia’s largest metropolitan development project and, when complete, will have living space for 40,000 people and workspace for another 40,000.

At the moment, Nordhavn is only served by a nearby S-train station and bus stops located near the station. There are no buses or trains running within the development area, although there are plans for an elevated metro line, and parking will be discouraged in the new neighbourhood. This is a great opportunity for autonomous vehicles (AVs) to operate as a new public transport solution, connecting this area more efficiently, says Professor Dimitri Konstantas at the University of Geneva in Switzerland.

‘We believe that AVs will become the new form of transport in Europe,’ he said. ‘We want to prove that autonomous vehicles are a sustainable, viable and environmental solution for urban and suburban public transportation.’

Prof. Konstantas is coordinating a project called AVENUE, which aims to do this in four European cities. In Nordhavn, the team plans to roll out autonomous shuttles on a loop with six stops around the seafront. They hope to have them up and running in two years. But once in place, the Nordhavn plan may provide a glimpse of how AV-based public transportation systems could work in the future.

Prof. Konstantas envisages these eventually becoming an on-demand, door-to-door service, where people can get picked up and travel where they want rather than following predetermined itineraries and fixed bus stops.

In Nordhavn, AVENUE will test and implement an autonomous ‘mobility cloud’, currently under development, to link the shuttles with existing public transport, such as the nearby train station. An on-demand service will ultimately allow passengers to access the available transport with a single app, says Prof. Konstantas.

Integrating autonomous shuttles into the wider transport system is vital if they are to take off, says Guido Di Pasquale from the International Association of Public Transport (UITP) in Brussels, Belgium.

‘Autonomous vehicles have to be deployed as fleets of shared vehicles, fully integrated and complementing public transport,’ he said. ‘This is the only way we can ensure a sustainable usage of AVs in terms of space occupancy, traffic congestion and the environment.’

Single service

Di Pasquale points to a concept known as Mobility-as-a-Service (MaaS) as a possible model for future transport systems. This model combines both public and private transport. It allows users to create, manage and pay for trips as a single service with an online account. For example, Uber, UbiGo in Sweden and Transport for Greater Manchester in the UK are exploring MaaS to let users get from one destination to another by combining transport modes and booking them as one trip, choosing their preferred option based on cost, time and convenience.
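The trip-selection idea behind MaaS can be sketched in a few lines. A minimal illustration follows; the operators, prices and times are hypothetical and do not come from any real MaaS platform or API:

```python
# Illustrative MaaS-style trip chooser: candidate multi-leg trips are
# compared and the best one picked by the user's stated preference.
# All data below are invented for the example.

trips = [
    {"legs": ["AV shuttle", "S-train"], "cost_eur": 3.20,  "minutes": 28},
    {"legs": ["bus", "S-train"],        "cost_eur": 2.80,  "minutes": 41},
    {"legs": ["ride-hail"],             "cost_eur": 14.50, "minutes": 19},
]

def pick_trip(trips, preference="time"):
    """Return the trip minimizing travel time or cost."""
    key = {"time": lambda t: t["minutes"],
           "cost": lambda t: t["cost_eur"]}[preference]
    return min(trips, key=key)

print(pick_trip(trips, "time")["legs"])  # ['ride-hail']
print(pick_trip(trips, "cost")["legs"])  # ['bus', 'S-train']
```

A real system would generate the candidate trips from live timetables and pricing, but the core choice, ranking combined itineraries by the rider's preference, is exactly this.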

Di Pasquale coordinates a project called SHOW, which aims to deploy more than 70 automated vehicles in 21 European cities to assess how they can best be integrated with different wider transport systems and diverse users’ needs. They are testing combinations of AV types, from shuttles to cars and buses, in real-life conditions over the next four years. During this time, he expects the project’s AVs to transport more than 1,500,000 people and 350,000 containers of goods. ‘SHOW will be the biggest ever showcase and living lab for AV fleets,’ he said.

He says that most of the cities involved have tested autonomous last-mile vehicles in the past and are keen to include them in their future sustainable urban mobility plans.

However, rolling out AVs requires overcoming city-specific challenges, such as demonstrating safety.

‘Safety and security risks have restricted the urban use of AVs to dedicated lanes and low speed — typically below 20km/h,’ explained Di Pasquale. ‘This strongly diminishes their usefulness and efficiency, as in most city environments there is a lack of space and a high cost to keep or build such dedicated lanes.’

It could also deter users. ‘For most people, a speed barely faster than walking is not an attractive solution,’ he said.

Di Pasquale hopes novel technology will make higher speeds and mixed traffic safer, and guarantee that fleets operate safely by monitoring and controlling them remotely.

Each city participating in SHOW will use autonomous vehicles in various settings, including mixed and dedicated lanes, at various speeds and in various weather conditions. For safety and regulation reasons, all of them will have a driver present.

The objective is to make the vehicle fully autonomous without the need for a driver as well as optimise the service to encourage people to make the shift from ownership of cars to shared services, according to Di Pasquale. ‘This would also make on-demand and last-mile services sustainable in less densely populated areas or rural areas,’ he said.


But the technical issues of making the vehicle autonomous are only a part of the challenge.

There’s also the issue of who pays for it, says Di Pasquale. ‘AVs require sensors onboard, as well as adaptations to the physical and digital infrastructure to be deployed,’ he explained. ‘Their market deployment would require cities to drastically renew their fleets and infrastructures.’

SHOW’s pilots are scheduled to start two years from now, as each city has to prepare by obtaining the necessary permits and getting the vehicles and technology ready, says Di Pasquale.

Getting authorisation to operate in cities is one of the biggest hurdles. City laws and regulations differ everywhere, says Prof. Konstantas.

AVENUE is still awaiting city licences to test in Nordhavn, despite a national law passed on 1 July 2017 allowing AVs to be tested in public areas. Currently, pilots are taking place in Lyon, France, and in Luxembourg. In Geneva, the team has obtained the required licences, and the world’s first on-demand AV public transportation service will be rolled out on a 69-bus-stop circuit this summer.

AVENUE’s initial results show that cities need to make substantial investments to deploy AVs and to benefit from this technology. The legal and regulatory framework in Europe will also need to be adapted for smooth deployment of services, says Prof. Konstantas.

Both he and Di Pasquale hope their work can pave the way to convince operators and authorities to invest in fleets across Europe’s cities.

‘Depending on the willingness of public authorities, this can take up to four years until we see real, commercially sustainable AV-based public transportation services on a large scale in Europe,’ said Prof. Konstantas.


The World Wide Web was invented almost 30 years ago by Tim Berners-Lee to help people easily share information around the world. Over the following decades, it has changed significantly – both in terms of design and functionality, as well its deeper role in modern society.

Just as the architectural style of a building reflects the society from which it emerges, so the evolution of web design reflects the changing fashions, beliefs and technologies of the time.


Climate-influenced temperatures raised the wildfire risk by 30 percent

Human-caused climate change made southeastern Australia’s devastating wildfires during 2019–2020 at least 30 percent more likely to occur, researchers report in a new study published online March 4.

A prolonged heat wave that baked the country in 2019–2020 was the primary factor raising the fire risk, said climate scientist Geert Jan van Oldenborgh, with the Royal Netherlands Meteorological Institute in De Bilt. The study also linked the extremity of that heat wave to climate change, van Oldenborgh said March 3 during a news conference to explain the findings. Such an intense heat wave in the region is about 10 times more likely now than it was in 1900, the study found.

Van Oldenborgh also noted that climate simulations tend to underestimate the severity of such heat waves, suggesting that climate change may be responsible for even more of the region’s high fire risk. “We put the lower boundary at 30 percent, but it could well be much, much more,” he said.

This week, the southeastern Australia region was declared free of wildfires for the first time in over 240 days, according to a statement March 2 by the New South Wales Rural Fire Service on Twitter. The fires have burned through an estimated 11 million hectares, killing at least 34 people and destroying about 6,000 buildings since early July. About 1.5 billion animals also died in the blazes. Researchers are still tallying the damage and assessing the potential for recovery for many native plant and animal species (SN: 2/11/20).

The climate attribution study was conducted by the World Weather Attribution group, an international consortium of researchers who investigate how much of a role climate change might be playing in natural disasters. Given the quick turnaround time, the study has not yet been peer reviewed. “We wanted to bring the scientific evidence [forward] at a time when the public is talking about the event,” said climate modeler Friederike Otto of the University of Oxford. The group examined how climate change altered the Fire Weather Index, an estimation of the risk of wildfires.

The climate simulations show that the probability of a high Fire Weather Index during the 2019–2020 season increased by at least 30 percent, relative to the fire risk in 1910. That is primarily due to the increase in extreme heat; the study was not able to determine the impact of climate change on extreme drought conditions, which also helped fuel the blazes.
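Figures like "10 times more likely" and "at least 30 percent" are probability ratios, the standard quantity in event attribution: the chance of exceeding a threshold in today's simulated climate divided by the chance in a counterfactual past climate. A minimal sketch with synthetic data follows; the distributions and threshold are invented for illustration and are not the study's numbers:

```python
# Event attribution compares how often a threshold-exceeding event occurs
# in an ensemble of today's climate versus a counterfactual past climate.
# The probability ratio is PR = p_now / p_past. Data here are synthetic.

import random

random.seed(42)

def exceedance_prob(samples, threshold):
    """Fraction of ensemble members exceeding the threshold."""
    return sum(s > threshold for s in samples) / len(samples)

# Synthetic "Fire Weather Index" seasons: the present-day ensemble is
# shifted toward hotter, drier conditions than the 1910-like ensemble.
past = [random.gauss(50, 10) for _ in range(100_000)]
now  = [random.gauss(55, 10) for _ in range(100_000)]

threshold = 70  # an "extreme fire weather" season, arbitrary here
p_past = exceedance_prob(past, threshold)
p_now = exceedance_prob(now, threshold)

print(p_now / p_past > 1)  # True: the event is more likely in today's climate
```

A ratio above 1 means climate change raised the odds of the event; "at least 30 percent more likely" corresponds to a probability ratio of at least 1.3.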

Researchers previously have suggested that an El Niño-like atmosphere-ocean weather pattern known as the Indian Ocean Dipole, which was in a strong positive phase in 2019, may have played a role in exacerbating the dry conditions (SN: 1/9/20). Global warming may make such extreme positive phases of this pattern more common. The new study confirmed that the 2019 positive phase made drought conditions more extreme, but could not confirm this particular phase’s relationship to climate change.

“It is always rather difficult to attribute an individual event to climate change,” but this study is nicely done, says Wenju Cai, a climate scientist at CSIRO who is based in Melbourne, Australia. The link identified to climate change is reasonable, if not particularly surprising, he says.

The year 2019 was Australia’s hottest and driest since modern recordkeeping began in the country in 1910. Summers Down Under also appear to be lengthening: The Australia Institute, a Canberra-based think tank, released a report March 2 that found that Australian summers during the years 1999 to 2018 lasted longer by a month, on average, than they did 50 years ago.

Temperature observations going back to 1910 show that the region’s temperatures have risen by about 2 degrees Celsius on average, van Oldenborgh and colleagues report. The climate simulations underrepresented that warming, however, showing an increase of only 1 degree Celsius in that time.

Climate modelers previously have struggled to reconcile the disparity between recorded temperatures and simulated heat waves: Simulations tend to underestimate the severity of the extreme events. The team noticed a similar underestimation in its simulations of the 2019 heat waves in Europe (SN: 7/2/19). Conditions not generally factored into regional climate simulations, such as land-use changes, may be responsible for the disparity. Changes in vegetation cover, for example, can have an impact on how hot or dry a region gets.

