Mediterranean wineries are in a climate hotspot. Climatologists are helping them adapt

To help the region’s agricultural producers cope with shifting weather patterns and make strategic decisions now for the future, scientists are researching new growing techniques and creating climate forecasts.

The Mediterranean Basin – comprising countries bordering the Mediterranean Sea – is a climate hotspot. It is experiencing faster-than-average rises in temperature and may suffer major losses of rainfall in future decades.

Wine makers are among those already feeling the effects.

‘Climate change is not only a thing of the future, it is happening now. We see an increase of mean temperatures, and this already has an impact on grape growing,’ said Josep Maria Solé Tasias, coordinator of VISCA, a project developing forecasts and pruning techniques to help vineyards adapt to climate change.

One impact is that higher temperatures make grapes ripen too early, before their aromas have had a chance to fully develop. ‘That is something the wineries are very worried about,’ said Solé Tasias, who is a civil engineer at Meteosim SL, a Spanish company offering meteorological services.

In southwestern France, the Bordeaux region’s famous Merlot and Sauvignon blanc grapes are expected to be victims of climate change, so wine makers there are testing more resilient grape varieties from southern and eastern Europe.

Another solution is to find cooler plots of land – further north or at higher elevations – to plant for the future.

But small wineries will find it difficult to make such large investments, says Solé Tasias. So VISCA has been testing some innovative farming techniques to see if they can minimise the damage.

These include ‘crop forcing’, which involves pruning vines so the grapes mature later in the growing season once temperatures have dropped. But deciding when to prune is difficult – too early or too late in the growing season would impact the harvest.

Forecasts

VISCA has developed seasonal forecasts that are helping farmers assess the best times to apply these techniques. The forecasts use detailed data about the vineyard – including location, soil type and grape variety – to estimate when vines will produce buds or grapes will ripen, as well as predicting temperatures and rainfall.
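Phenology estimates of this kind are typically driven by accumulated heat. As a rough illustration of the idea, here is a minimal growing-degree-day sketch, assuming the common 10°C base temperature for grapevines; the temperatures and the threshold are invented for illustration and are not VISCA's actual model:

```python
# Minimal growing-degree-day (GDD) sketch. Phenology models estimate events
# such as bud break from heat accumulated above a base temperature.
# The 10°C base is a common choice for grapevines; the threshold below
# is invented for illustration and is not VISCA's model.
BASE_TEMP_C = 10.0
BUDBREAK_GDD = 100.0  # hypothetical accumulated-heat threshold

def daily_gdd(t_min, t_max, base=BASE_TEMP_C):
    """Heat accumulated in one day: mean temperature above the base, floored at zero."""
    return max(0.0, (t_min + t_max) / 2.0 - base)

daily_temps = [(6, 14), (8, 17), (9, 19), (11, 22), (12, 24)]  # (min, max) in °C
total = 0.0
for day, (t_min, t_max) in enumerate(daily_temps, start=1):
    total += daily_gdd(t_min, t_max)
    if total >= BUDBREAK_GDD:
        print(f"bud break predicted on day {day}")
        break
else:
    print(f"accumulated {total:.1f} GDD so far; bud break threshold not yet reached")
```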

But unlike short-term weather forecasts, which can accurately predict whether there will be a frost or warm sunshine, seasonal forecasts of up to six months ahead are much less certain. Knowing how to use them for decision-making is complex, says Solé Tasias.

‘Farmers at the moment don’t know exactly how to use them – they are used to making decisions in the short-term,’ said Solé Tasias.

A seasonal forecast could, for example, say there is a 60% probability of a particularly warm summer. If a farmer delays the ripening of their grapes based on this assumption, they may lose money if the summer turns out to be normal.

‘Farmers have to understand that their decision can result in losses,’ said Solé Tasias.
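The arithmetic behind that warning is a simple expected-value comparison. A toy sketch, with every probability and payoff invented for illustration rather than taken from VISCA:

```python
# Toy expected-value comparison for acting on a probabilistic seasonal
# forecast. All probabilities and payoffs are invented for illustration.
p_warm = 0.60  # forecast: 60% chance of a particularly warm summer

# Net outcome (arbitrary units) of each action in each scenario:
payoff = {
    ("delay ripening", "warm"):   +20,  # grape quality preserved
    ("delay ripening", "normal"):  -8,  # pruning cost, needlessly late harvest
    ("do nothing",     "warm"):   -25,  # grapes ripen too early
    ("do nothing",     "normal"):   0,
}

for action in ("delay ripening", "do nothing"):
    expected = (p_warm * payoff[(action, "warm")]
                + (1 - p_warm) * payoff[(action, "normal")])
    print(f"{action}: expected payoff {expected:+.1f}")

# With these numbers, delaying wins on average (+8.8 vs -15.0), yet the
# farmer still loses money in the 40% of summers that turn out normal.
```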

To help with this, VISCA has worked with some wineries to create a list of actions based on each short-term and seasonal forecast – for example, buy more chemicals to deal with a possible spike in pest numbers, or prune the vines to delay the grape harvest – and spell out the financial risks associated with each option.

The options and risks will be tailored to each vineyard or winery. And the more information the researchers have about the vineyard, the better they can forecast, they say.

Unpredictability

Long-term climate forecasting is particularly difficult in the Mediterranean region, says Dr Alessandro Dell’Aquila, co-coordinator of the MED-GOLD project, which is developing climate services for pasta, olive oil and wine producers.

‘It has an intrinsic unpredictability because there is a lot of noise due to large-scale (atmospheric) movements and perturbations,’ said Dr Dell’Aquila, who is a climatologist at the Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA).

The tropics, by contrast, are more stable, which means that seasonal forecasts for coffee, tea, maize and other crops in parts of Africa and South America could be more accurate.

But seasonal forecasts will still be vital for Mediterranean farmers despite their uncertainty, says Dr Dell’Aquila.

The longer-term impacts of climate change on the Mediterranean are likely to be severe.

‘The Mediterranean could look very different in future decades. We may have completely different species of animals or insects that could arrive from the tropics, and we could experience loss of local biodiversity,’ said Dr Dell’Aquila.

We could also have less water available, including for agricultural purposes, he says. ‘And the region may experience a higher number of (severe) heatwaves.’

Some crops will need to be grown on higher ground or further north where the climate will be cooler and wetter. More field irrigation will be needed and, in the case of grapes, different varieties will have to be grown.

Parts of Europe may open up for wine and olive oil production for the first time, while other areas may see a collapse.

‘There are some ideas of moving olive trees northward to new growing regions. And parts of the Mediterranean – for example, north Africa – could become too hot for olive groves.’

Similarly, while wine production has recently expanded in the UK and Denmark, certain southern Italian wines may become extremely rare within the next decade, Dr Dell’Aquila says.

Support

EU policy needs to change to support producers adapting to climate change, he says. Rules that regulate the composition of wines, for example, could be changed to allow producers to use different varieties of grape – even grapes from different regions – without changing the name of the wine. ‘This could be very important for consumers because they want to go to the supermarket and find a (Chianti), and the name of this wine is clearly defined in some EU rules.’

In the meantime, producers need to act now. ‘Wine-makers should start thinking now where they can buy new plots of land and start planting grapes as an investment for the next 10 or 20 years,’ said Dr Dell’Aquila.




Uncovering the mysteries of milk

Sarah Nyquist got her first introduction to biology during high school, when she took an online MIT course taught by genomics pioneer Eric Lander. Initially unsure what to expect, she quickly discovered biology to be her favorite subject. She began experimenting with anything she could find, starting with an old PCR machine and some dining hall vegetables.

Nyquist entered college as a biology major but soon gravitated toward the more hands-on style of the coursework in her computer science classes. Even as a computer science major and a two-time summer intern at Google, biology was never far from Nyquist’s mind. Her favorite class was taught by a computational biology professor: “It got me so excited to use computer science as a tool to interrogate biological questions,” she recalls.

During her last two years as an undergraduate at Rice University, Nyquist also worked in a lab at Baylor College of Medicine, eventually co-authoring a paper with Eric Lander himself.

Nyquist is now a PhD candidate studying computational and systems biology. Her work is co-advised by professors Alex Shalek and Bonnie Berger and uses machine learning to understand single-cell genomic data. Since this technology can be applied to nearly any living material, Nyquist was left to choose her focus.

After shifting between potential thesis ideas, Nyquist finally settled on studying lactation, an important and overlooked topic in human development. She and postdoc Brittany Goods are currently part of the MIT Milk Study, the first longitudinal study to profile the cells in human breast milk using single cell genomic data. “A lot of people don’t realize there’s actually live cells in breast milk. Our research is to see what the different cell types are and what they might be doing,” Nyquist says.

While she started out at MIT studying infectious diseases, Nyquist now enjoys investigating basic science questions about the reproductive health of people assigned female at birth. “Working on my dissertation has opened my eyes to this really important area of research. As a woman, I’ve always noticed a lot is unknown about female reproductive health,” she says. “The idea that I can contribute to that knowledge is really exciting to me.”

The complexities of milk

For her thesis, Nyquist and her team have sourced breast milk from over a dozen donors. Samples were collected from immediately postpartum to around 40 weeks later, providing insight into how breast milk changes over time. “We took record of the many changing environmental factors, such as if the child had started daycare, if the mother had started menstruating, or if the mother had started hormonal birth control,” says Nyquist. “Any of these co-factors could explain the compositional changes we witnessed.”

Nyquist also hypothesized that discoveries about breast milk could serve as a proxy for studying breast tissue. Since breast tissue is needed for lactation, researchers have historically struggled to collect tissue samples during this period. “A lot is unknown about the cellular composition of human breast tissue during lactation, even though it produces an important early source of nutrition,” she adds.

Overall, the team has found a lot of heterogeneity between donors, suggesting breast milk is more complicated than expected. They observed that the cells in milk consist primarily of a type of structural cell that increases in quantity over time. Her team hypothesized that this transformation could be due to the high turnover of breast epithelial tissue during breastfeeding. While the reasons are still unclear, their data add to the field’s previous understanding.

Other aspects of their findings have validated some early discoveries about important immune cells in breast milk. “We found a type of macrophage in human breast milk that other researchers have identified before in mouse breast tissue,” says Nyquist. “We were really excited that our results confirmed similar things they were seeing.”

Applying her research to Covid-19

In addition to studying cells in breast milk, Nyquist has applied her skills to studying organ cells that can be infected by Covid-19. The study began early in the pandemic, when Nyquist and her lab mates realized they could explore their lab’s collective cellular data in a new way. “We began looking to see if there were any cells that expressed genes that can be hijacked for cellular entry by the Covid-19 virus,” she says. “Sure enough, we found there are cells in nasal, lung, and gut tissues that are more susceptible to mediating viral entry.”

Their results were published and communicated to the public at a rapid speed. To Nyquist, this was evidence of how collaboration and computational tools are essential to producing next-generation biological research. “I had never been on a project this fast-moving before — we were able to produce figures in just two weeks. I think it was encouraging to the public to see that scientists are working on this so quickly,” she says.

Outside of her own research, Nyquist enjoys mentoring and teaching other scientists. One of her favorite experiences was teaching coding at HSSP, a multi-weekend program for middle and high schoolers run by MIT students. The experience encouraged her to think of ways to make coding approachable to students of any background. “It can be challenging to figure out whether to message it as easy or hard, because either can scare people away. I try to get people excited enough to where they can learn the basics and build confidence to dive in further,” she says.

After graduation, Nyquist hopes to continue her love for mentoring by pursuing a career as a professor. She plans on deepening her research into uterine health, potentially by studying how different infectious diseases affect female reproductive tissues. Her goal is to provide greater insight about biological processes that have long been considered taboo.

“It’s crazy to me that we have so much more to learn about important topics like periods, breastfeeding, or menopause,” says Nyquist. “For example, we don’t understand how some medications impact people differently during pregnancy. Some doctors tell pregnant people to go off their antidepressants because they worry it might affect their baby. In reality, there’s so much we don’t actually know.”

“When I tell people that this is my career direction, they often say that it’s hard to get funding for female reproductive health research, since it only affects 50 percent of the population,” she says.

“I think I can convince them to change their minds.”

Written by Hannah Meiseles

Source: Massachusetts Institute of Technology





Launch of a new wiki for fish-friendly hydropower plants

Scientists from SINTEF are launching a new wiki-website that lists tools and measures available to make hydropower more fish-friendly. The project draws on the concept of environmental design, which integrates technical, economic, environmental and social aspects when developing or re-designing energy projects.

Øvre Leirfoss. Image credit: SINTEF

– The objective of the wiki is to improve the conditions for fish while preserving hydropower production, by presenting the most up-to-date measures and innovative tools brought to us by environmental design, says Bendik Torp Hansen, who leads the project at SINTEF Energy.

The wiki is a product of the EU-project FIThydro. Its main target audiences are regulatory bodies who want to adopt new protective measures, hydropower companies, consultants, stakeholders and scientists.

– Climate change requires us to increase renewable energy generation including hydropower. We also need to use hydropower in a flexible way to integrate wind and solar energy, says SINTEF Energy senior researcher Atle Harby.

– This must be done respecting nature, which means we need environmental design. Environmental design is a method that balances the needs of energy production and the environment by combining knowledge about hydropower, hydrology, technology, economy, biology and society, Atle Harby adds.

A one-stop shop for all the measures

– The wiki gathers all the measures that can be adopted to make hydropower plants more fish-friendly while maintaining or even increasing electricity generation, says Bendik Torp Hansen. We assembled as much information as possible in one place, and that hadn’t been done before.

The wiki presents a catalog of mitigation measures that can be used to solve challenges related to environmental flows, habitat conditions, sediment management, and up- and downstream migration for fish. The wiki also provides information on methods, tools and devices that can be used to plan, implement, monitor and maintain the measures.

– A solid knowledge base is important and necessary to ensure a good balance between advantages and disadvantages of various environmental mitigation measures in any given case. A knowledge database such as this wiki will contribute to that, says senior engineer Eilif Brodtkorb of the Norwegian Water Resources and Energy Directorate (NVE).

Every measure is described with a classification table that provides extra information, such as the type of river it’s suitable for, what species of fish it can help, as well as the maturity level of the technology. The cost of every measure is also listed, when the available data makes this possible.

– The wiki is open to everyone in order to make our project results public, says Torp Hansen.

Decision support tool

The FIThydro project also created a decision support system (DSS) which is linked to the wiki. Together, the DSS and the wiki help users to find possible and suitable mitigation measures.

– The wiki can support decision processes and contribute to simplifying communication and debate by giving users a common knowledge base, says Torp Hansen. It is a living document that will continuously be updated by experts with the newest technology and research.

Source: SINTEF





Manchester researchers team up with Callaly to discover recyclable materials for menstrual products

Researchers from the Henry Royce Institute at The University of Manchester’s Sustainable Materials Innovation Hub (SMI Hub) are collaborating with Callaly to find alternative sustainable materials for menstrual hygiene products, helping to meet the growing need for natural, renewable alternatives to plastics.

New funding from the Engineering and Physical Sciences Research Council has enabled the collaboration, which seeks to use surplus materials from shellfish industries to develop bioplastics. The novel materials will replace the stretchable films in feminine hygiene products, with the ultimate aim of reducing the use of non-renewable and non-recyclable materials.

Callaly is a UK-based developer and manufacturer of period care products with an international sales footprint. The new project will utilise expertise and state-of-the-art equipment from the SMI Hub to find suitable alternatives to raw polymer materials.

The majority of period care products are designed to address the practical needs of the menstrual cycle and are often made from single-use plastics. Their properties and the excess of organic contamination make recycling a significant challenge.

By utilising surplus materials from shellfish industries, researchers hope to develop bioplastics that can replace the stretchable films in feminine hygiene products. The properties of the by-products offer a unique opportunity for developing functional films that are optically transparent, stretchable and antimicrobial.

Commenting on the project, lead researcher and Kathleen Lonsdale Research Fellow in the Department of Materials, Dr Ahu Gumrah Parry, said: “We’re excited to be teaming up with Callaly on this project. Our efforts will unlock the potential of biopolymers as a biomedical material. Furthermore, using materials from shellfish farming waste streams to conduct this research helps us to enable a circular economy. Where disposal is necessary, such as for feminine hygiene products, we want to ensure that the environmental impact is minimized by offering routes for biodegradable and compostable products”.

The funding will enable initial research into biodegradable components for Callaly’s award-winning Tampliner products.

Thang Vo-Ta, CEO & Co-Founder at Callaly said “We are delighted to receive this Engineering and Physical Sciences Research Council funding to test and develop exciting new materials that we can integrate into our products & bespoke manufacturing processes. As a B Corp, Callaly always holds ourselves to the highest standards and to be able to team up with the SMI Hub for greater sustainability in the period care market could make for very meaningful & positive impact”.

Source: University of Manchester





E-scooters as a new micro-mobility service

A new study by scientists from Future Urban Mobility (FM), an interdisciplinary research group at the Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, has found that e-scooters, while considered by some to be a hazard to pedestrians and others, provide an important alternative mode of transit, particularly in urban areas. This study sheds light on the growing utility of e-scooters as a micro-mobility service in Singapore, and will also inform operators, planners, and policymakers on how best to harness and regulate this growing mode of mobility.


The study is described in a paper titled “E-scooter sharing to serve short-distance transit trips: A Singapore case,” published recently in Transportation Research Part A: Policy and Practice. The work was led by Zhejing Cao, a PhD researcher at Tsinghua University who carried out the study as a visiting SMART FM student, and co-authored by Jinhua Zhao, SMART FM lead principal investigator and associate professor in MIT’s Department of Urban Studies and Planning; Xiaohu Zhang, an assistant professor at The University of Hong Kong; and Kelman Chua and Honghai Yu from Neuron, a Singapore e-scooter sharing operator.

Having first been introduced in Singapore around 2013, the e-scooter, also known in Singapore as a type of personal mobility device, grew swiftly in popularity as an affordable and convenient mobility alternative to driving on Singapore’s often-congested roads during peak hours. The number of e-scooter users has grown rapidly, swelling to around 100,000 registered e-scooters in Singapore by November 2019.

Their popularity, however, has not been problem-free, as a number of serious and even fatal accidents have involved e-scooters and their riders colliding with pedestrians. This ultimately led to a government-mandated ban on their use on footpaths across Singapore in November 2019.

Despite the footpath ban, e-scooters are currently a legal mode of mobility on some 440 kilometers of cycling paths island-wide, and remain in widespread use in Singapore today. E-scooter sharing services have also emerged in many cities worldwide, including Singapore, with companies such as Telepod and Neuron. They provide a convenient micro-mobility service to the public, with rental locations situated across the island and e-scooters that can be unlocked and paid for via a smartphone app. In addition, e-scooters are an environmentally friendly alternative to other transportation options: they reduce carbon emissions, improve quality of life and health, and offer a mobility aid to the elderly and the disabled.

Lead author Zhejing Cao acknowledges safety concerns surrounding the use of e-scooters, but also notes that they bring compelling benefits to many. “The safety concerns around e-scooter use and the safety of all road and footpath users are of utmost importance,” she says. “Nevertheless, given the numerous benefits that e-scooters bring to the mobility and transport ecosystem, we hope that our work will help inform policymakers on facilitating the safe and regulated use of e-scooters as an emerging but important micro-mobility service.”

In the Singapore Central Area (SCA), despite the high accessibility of the mass rapid transit system island-wide, the ratio of rapid transit network distance to the shortest street path can be much higher than the global average. Furthermore, 20.98 percent of rapid transit trips in SCA have at least one transfer, higher than the average transfer level in Singapore, and passengers may have to walk an average of nearly 1 kilometer to enter and exit rapid transit stations. As a result, even if the origin and destination of a trip are in close geographical proximity, the overall traveling journey may be suboptimal, and could be made more efficient.

E-scooters can provide a valuable alternative in this segment. As such, it is valuable to explore and investigate the practicality and potential of using e-scooter sharing to replace certain short-distance transit trips where alternative transit methods may not be convenient.

In this study, the researchers collaborated with Neuron to explore the potential of using e-scooter sharing to replace short-distance transit trips in the SCA. The researchers conducted a stated preference survey of e-scooter users in the SCA and estimated mixed logit models to examine the factors influencing a user’s choice between e-scooter and transit. Based on this, the number of transit trips that could be replaced by e-scooters was calculated, and the researchers then analyzed the decisions e-scooter companies face in trading off serving more e-scooter trips against generating more revenue under varying fares.
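At its core, a mixed logit model lets each traveller’s sensitivity to fare, transfers and walking vary randomly across the population, then averages choice probabilities over that variation. A minimal simulation sketch, with all coefficients and trip attributes invented for illustration (not the paper’s estimates):

```python
import numpy as np

rng = np.random.default_rng(0)

# One hypothetical short trip; attributes are illustrative, not survey data.
escooter = {"fare": 2.0, "transfers": 0, "walk_km": 0.1}
transit  = {"fare": 1.2, "transfers": 1, "walk_km": 0.9}

def utility(opt, b_fare, b_transfer, b_walk):
    return (b_fare * opt["fare"] + b_transfer * opt["transfers"]
            + b_walk * opt["walk_km"])

# Mixed logit: taste coefficients vary across travellers, so the choice
# probability is averaged over random draws. All three effects are negative,
# mirroring the study's finding that fare, transfers and walking deter use.
n = 10_000
b_fare     = rng.normal(-1.0, 0.3, n)
b_transfer = rng.normal(-0.8, 0.4, n)
b_walk     = rng.normal(-1.5, 0.5, n)

v_e = utility(escooter, b_fare, b_transfer, b_walk)
v_t = utility(transit, b_fare, b_transfer, b_walk)
p_escooter = np.mean(1.0 / (1.0 + np.exp(v_t - v_e)))  # binary logit per draw
print(f"simulated probability of choosing the e-scooter: {p_escooter:.2f}")
```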

The researchers found that fare, transit transfers, and transit walking distance have significant negative impacts on mode utilization, with substantial random variation in preferences across respondents. The uncertainty is higher when predicting the e-scooter usage preferences of male, young, and high-income groups. In analyzing travel demand under different levels of transit inconvenience, the researchers discovered that greater transit indirectness, more transfers, and longer access-egress walking all result in a higher average probability of using e-scooters.

Through analyzing e-scooter companies’ decisions, the researchers also found that the revenue losses borne by e-scooter companies can be significant if e-scooter mode share is maximized with no regard for other considerations, and vice versa.

To achieve a better balance between these two competing goals, the researchers located the optimal trade-off between the two maximization extremes: the sweet spot where a small sacrifice in one goal prevents a great loss in the other.

“E-scooter sharing services have shown enormous potential to become an important component of transit systems in urban environments in Singapore and other cities worldwide,” Cao says. “Our study has highlighted the shortcomings of public transport in serving short-distance journeys in the SCA. E-scooter sharing services are able to bridge this gap and provide a convenient micro-mobility service to the public.”

Co-author Xiaohu Zhang adds, “E-scooter sharing as a new form of micro-mobility will improve the overall efficiency of urban transportation systems through enhancing last-mile connectivity as well as serving short-distance travels. It also has huge potential in the future if powered by autonomous driving technology.”

The findings of the SMART study can be used to inform operators, planners, and policymakers on how to harness and regulate this new mobility service, as well as provide suggestions on deploying shared e-scooters to satisfy demand unmet by transit, especially where transit travel involves greater indirectness, transfer, and access-egress walking distance. E-scooter supply strategies at different locations can be varied according to various socio-demographic factors that influence e-scooter preference and mode choices.

When public authorities and private operators take conflicting positions on whether to serve more individual trips or generate greater revenue, the trade-off can be gauged to achieve balance. Such possible means of mitigating disparity between the two goals and achieving balance include administrative regulations (e.g., requiring operators to serve inconvenient short transit trips at certain designated locations) or economic interventions (e.g., subsidies to operators provided by public authorities).

Jinhua Zhao adds, “Evidently, e-scooters present unique advantages and challenges for regulators and policymakers. If managed and regulated effectively, e-scooter sharing services can play an important role in the public mobility circuit, filling a gap in the short-distance transit segment that public transport is, as yet, often unable to fill.”

The research is carried out by SMART and supported by the National Research Foundation (NRF) Singapore under its Campus for Research Excellence And Technological Enterprise (CREATE) program.

FM is one of five interdisciplinary research groups in SMART. FM harnesses new technological and institutional innovations to create the next generation of urban mobility systems to increase accessibility, equity, safety, and environmental performance for the citizens and businesses of Singapore and other metropolitan areas worldwide. SMART-FM is supported by the NRF and situated in CREATE.

Written by Singapore-MIT Alliance for Research and Technology

Source: Massachusetts Institute of Technology





Finding the love hormone in a stressed-out world

In Jenna Sutela’s work, which ranges from computational poetry to experimental music to installations and performance, the MIT Center for Art, Science & Technology (CAST) Visiting Artist enlists microbes and neural networks as co-creators.

“I want to explore this notion of expanded authorship through bringing in beyond-human life forms,” Sutela says. Inspired by science fiction, she employs both nature’s oldest technologies — the slime mold Physarum polycephalum, which has been compared to a computer — and the newest ones developed in research labs. Bacteria and artificial intelligence are among her many collaborators in creating artworks that challenge the deeply ingrained idea that humans exist apart from the teeming, vibrating world that contains us.

Video still from “Wet-on-Wet”, 2021, HD video, sound, 7’04’’. Image credit: Jenna Sutela and Markus Buehler

In April, Sutela participated in an Open Systems panel, moderated by Caroline Jones, professor of history, theory and criticism, as part of this year’s CAST symposium. “Jenna Sutela operates in the fluid spaces of artistic knowing, bridging vastly separated topics from the ‘distributed intelligence’ of slime mold to the ‘alien intelligence’ flowing into a Victorian trance medium,” Jones notes. “When I proposed her for a visiting artist residency, I knew she would thrive in the edgy MIT research labs.”

As a visiting artist, Sutela was inspired by the sonifications of Markus Buehler, the Jerry McAfee (1940) Professor of Engineering, which are created by translating the vibrations of protein chains into audible sound. This is a field she has been following attentively: the different (still niche) scientific practices of observing life by listening instead of only looking.

As part of this research, Buehler had recently sonified the molecular structure of the coronavirus. “Not only can phenomena within the structure of materials — such as the motion or folding of molecules — be heard and open a new way to understand nature, but it also expands our palette of musical composition,” Buehler says. “When used reversibly, they yield a systematic approach to design new matter, such as new protein molecules that emerge from this process, complementing what evolution produces.”

“A lot of my work consists of using microscopes and telescopes to communicate things that are beyond our ability to experience firsthand,” Sutela says. She has given a voice, or a hum, to Bacillus subtilis bacteria that thrive both in our guts and in outer space. With the pandemic introducing a profound new global anxiety, Sutela wondered if the surge of chemicals that cause emotions such as love or bonding — what are referred to as “emotive molecules” in the project — could similarly be translated into perceptible form.

Seeing like a machine

Meanwhile, Buehler had spent the past decade listening to proteins and using them as “instruments” and a source for auditory compositions. He had only lately turned to making these molecular patterns comprehensible to another human sense: sight. By hooking an actuator up to a petri dish of water, he was able to see how molecular vibrations manifested as visible water waves. “But when I looked at all these different patterns from different proteins and mutations and I couldn’t really clearly distinguish them, I thought, ‘Maybe a machine learning algorithm might be able to do that, and help in the cross-domain translation,’” he recalled.

The computer, with its artificial neural network, became a creative collaborator. “The computer has now understood the mechanisms of these vibrations and how they relate to different proteins, or molecules. Then I can actually take an image, and ask the algorithm, ‘What do you see in this picture?’” says Buehler. The computer then “draws” on the image, superimposing the patterns it detects on top of the picture to an almost psychedelic effect. In a photo taken on a recent trip to the seaside (like so many in quarantine, Buehler had found himself spending more time outdoors), he discovered the computer could pick out the invisible molecular patterns of the ocean and craggy rock.

This resonates with Sutela’s earlier machine learning-based work that has sought to “get in touch with the nonhuman condition of the computers that work as our interlocutors or infrastructure, or the computers even getting in touch with the more-than-human world around them.”

The oceanic

Could the computer detect a molecule of emotion? In short, could it see love? Buehler and postdoc Kai Guo from his lab at MIT conducted molecular dynamics modeling of the chemical structure of oxytocin, the hormone and neurotransmitter that is involved in childbirth and breast-feeding. He then translated this structure into vibrations, and taught the computer how to recognize them. “The human inspiration came in through Jenna,” he says.

She first emailed him a video of a quavering jellyfish, its translucent body indistinguishable from the surrounding sea. Then she began sending videos of wet-on-wet watercolor paintings that she had made as a form of lockdown meditation. This technique embraces unpredictability by letting the flow of water determine the shapes on the wet paper. The sense of calm she experienced was subsequently reflected by the algorithm as it traced the forms of the neurotransmitters and other emotive molecules over the moving images.

Kai Guo created the molecular dynamics simulations shown in the video. “So, now you have the transcendence between the scales: the molecular, the quantum scale, to the audible scale to the visual scale, and then to the human,” says Buehler.

The video, titled “Wet-on-Wet,” will debut in an online exhibition, Survivance, organized by the Guggenheim Museum and the publishing platform e-flux. Sutela believes the idea of water, connecting humans to each other and the wider environment, dislodges assumptions about individualism. “There’s this idea of oceanic feeling, a sense of oneness with the world, or this kind of limitlessness that’s triggered by the oxytocin hormone,” Sutela says. “When talking about the oceanic, I’d like to focus on not just a feeling, but also work towards our responsibilities as part of both the ecosystem and the society.”

Finding a universal language

The way the SARS-CoV-2 virus has radically transformed the organization of our lives is evidence enough of the agency of nonhuman matter, and of the ways in which the animate and inanimate are deeply enmeshed. “Wet-on-Wet” is, in a sense, an empathic overture to this more-than-human world, an attempt to find a common language in the form of waves, despite the limitations of our human senses.

The universe, we know, is always in motion, and each of us is vibrating matter. Sutela and Buehler’s work reminds us of our oneness based on this simple physical fact. As Caroline Jones notes, Sutela “helps us see the world as offering infinite kinship.”

Being able to visualize molecular vibrations may lead us to a greater appreciation of our interconnectedness across species, adds Buehler. The patterns of molecules that comprise a human body, after all, are similar to the patterns that might make up a rock, a jellyfish, or a piece of slime mould. “We live on a symbiotic planet,” says Sutela, “we’re part and parcel.”

Written by Anya Ventura

Source: Massachusetts Institute of Technology





Patenting a fibre optic monitoring system for 5G light-powered networks

The Universidad Carlos III de Madrid (UC3M), together with the Universidad Politécnica de Valencia (UPV), has patented a multicore fibre optic monitoring system for future use in 5G networks. This system will optimise energy consumption while preserving data transmission capacity.

The system, developed by the UC3M’s Photonic Displays and Applications research group, uses light delivered over a fibre optic infrastructure to power the controls that turn antennas on and off. “What we are going to achieve is a parallel system that will monitor the node’s energy needs at all times. In other words, if there is no user in the cell, which is the physical area covered by a particular antenna, we will turn it off so that it is not consuming energy,” says Carmen Vázquez, professor at the Department of Electronic Technology.

In addition to this, by receiving a single optical signal, the system can also monitor temperature changes in the fibre core, energy distribution using optical means at different network points, and the state of the communication channel used within the fibre. “If lots of energy is sent, the temperature inside the fibre might increase and, therefore, the fibre could be damaged. This system helps us know how much energy we are sending and make sure that the infrastructure we are using to send that energy is in good condition and we are not damaging it,” notes Vázquez.

The system can also be integrated into the communications channel itself, with minimal insertion losses and monitoring on a different control channel to the channel being used to send energy. Currently, there is no commercial system that integrates this type of technique, according to the research team.

This patent has been created in collaboration with the ITEAM-UPV’s Photonics Research Labs, who manufactured the semi-reflective mirrors embedded in the optical fibres. “Fibre-manufactured devices monitor the power reaching the nodes in real-time, while indicating the temperature, without affecting the power of the data being transmitted. This is the basis for the technique developed by the UC3M group,” notes Salvador Sales, professor and researcher at the ITEAM-UPV.

The results of the research, published recently in the Journal of Lightwave Technology, co-edited by the Optical Society of America (OSA) and the IEEE Photonics Society, show some of the applications that the invention may have.

This patent has been developed within the framework of a wider line of research: BlueSPACE (5G PPP BlueSpace Project Grant 762055), a three-year European research project, led by Eindhoven University of Technology, that aims to develop next-generation wireless technologies. BlueSPACE seeks to increase the speed of the current network while reducing energy consumption through centralised technologies and multicore fibres. The UC3M’s contributions to remote light-powering have been evaluated for inclusion among the innovative technologies funded by the European Union and in the Innovation Radar, an initiative of the European Commission.

Source: Universidad Carlos III de Madrid





Dark matter: ‘real stuff’ or gravity misunderstood?

For many years now, astronomers and physicists have been at odds: is the mysterious dark matter that we observe deep in the Universe real, or is what we see the result of subtle deviations from the laws of gravity as we know them? In 2016, Dutch physicist Erik Verlinde proposed a theory of the second kind: emergent gravity. New research, published in Astronomy & Astrophysics this week, pushes the limits of dark matter observations to the unknown outer regions of galaxies, and in doing so re-evaluates several dark matter models and alternative theories of gravity. Measurements of the gravity of 259,000 isolated galaxies show a very close relation between the contributions of dark matter and those of ordinary matter, as predicted in Verlinde’s theory of emergent gravity and an alternative model called Modified Newtonian Dynamics. However, the results also appear to agree with a computer simulation of the Universe that assumes that dark matter is ‘real stuff’.

In the centre of the image the elliptical galaxy NGC5982, and to the right the spiral galaxy NGC5985. These two types of galaxies turn out to behave very differently when it comes to the extra gravity – and therefore possibly the dark matter – in their outer regions. Image credit: Bart Delsaert (www.delsaert.com).

The new research was carried out by an international team of astronomers, led by Margot Brouwer (RUG and UvA). Further important roles were played by Kyle Oman (RUG and Durham University) and Edwin Valentijn (RUG). In 2016, Brouwer also performed a first test of Verlinde’s ideas; this time, Verlinde himself also joined the research team.

Matter or gravity?

So far, dark matter has never been observed directly – hence the name. What astronomers observe in the night sky are the consequences of matter that is potentially present: bending of starlight, stars that move faster than expected, and even effects on the motion of entire galaxies. Without a doubt all of these effects are caused by gravity, but the question is: are we truly observing additional gravity, caused by invisible matter, or are the laws of gravity themselves the thing that we haven’t fully understood yet?

To answer this question, the new research uses a method similar to the one used in the original test of 2016. Brouwer and her colleagues make use of an ongoing series of photographic measurements that started ten years ago: the KiloDegree Survey (KiDS), performed using ESO’s VLT Survey Telescope in Chile. In these observations, one measures how starlight from faraway galaxies is bent by gravity on its way to our telescopes. Whereas in 2016 the measurements of such ‘lensing effects’ covered only an area of about 180 square degrees of the night sky, this has since been extended to about 1,000 square degrees – allowing the researchers to measure the distribution of gravity in around a million different galaxies.

Comparative testing

Brouwer and her colleagues selected over 259,000 isolated galaxies for which they were able to measure the so-called ‘Radial Acceleration Relation’ (RAR). The RAR compares the amount of gravity expected based on the visible matter in a galaxy to the amount of gravity that is actually present – in other words, it shows how much ‘extra’ gravity there is beyond that due to normal matter. Until now, the amount of extra gravity had only been determined in the outer regions of galaxies by observing the motions of stars, and in a region about five times larger by measuring the rotational velocity of cold gas. Using the lensing effects of gravity, the researchers were now able to determine the RAR at gravitational strengths one hundred times smaller, allowing them to penetrate much deeper into the regions far outside individual galaxies.
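To give a feel for the numbers: a widely used empirical fit to the RAR, published by McGaugh and colleagues in 2016 from galaxy rotation curves (not a result of this paper), relates the observed acceleration to the acceleration expected from visible matter alone. A short sketch:

```python
import numpy as np

# Empirical RAR fitting function (McGaugh, Lelli & Schombert 2016):
#   g_obs = g_bar / (1 - exp(-sqrt(g_bar / g_dagger))),
# where g_bar is the acceleration expected from visible (baryonic) matter.
G_DAGGER = 1.2e-10  # characteristic acceleration scale, m/s^2

def g_observed(g_bar):
    return g_bar / (1.0 - np.exp(-np.sqrt(g_bar / G_DAGGER)))

for g_bar in np.logspace(-13, -9, 5):  # from far outskirts to inner galaxy
    factor = g_observed(g_bar) / g_bar  # the 'extra gravity' factor
    print(f"g_bar = {g_bar:.0e} m/s^2 -> observed gravity {factor:4.1f}x baryonic")

# At the tiny accelerations probed by lensing, the observed gravity is tens
# of times the baryonic expectation - the 'extra' component in question.
```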

This made it possible to measure the extra gravity extremely precisely – but is this gravity the result of invisible dark matter, or do we need to improve our understanding of gravity itself? Author Kyle Oman indicates that the assumption of ‘real stuff’ at least partially appears to work: “In our research, we compare the measurements to four different theoretical models: two that assume the existence of dark matter and form the base of computer simulations of our universe, and two that modify the laws of gravity – Erik Verlinde’s model of emergent gravity and the so-called ‘Modified Newtonian Dynamics’ or MOND. One of the two dark matter simulations, MICE, makes predictions that match our measurements very nicely. It came as a surprise to us that the other simulation, BAHAMAS, led to very different predictions. That the predictions of the two models differed at all was already surprising, since the models are so similar. But moreover, we would have expected that if a difference would show up, BAHAMAS was going to perform best. BAHAMAS is a much more detailed model than MICE, approaching our current understanding of how galaxies form in a universe with dark matter much closer. Still, MICE performs better if we compare its predictions to our measurements. In the future, based on our findings, we want to further investigate what causes the differences between the simulations.”

Young and old galaxies

Thus it seems that at least one dark matter model does appear to work. However, the alternative models of gravity also predict the measured RAR. A standoff, it seems – so how do we find out which model is correct? Margot Brouwer, who led the research team, continues: “Based on our tests, our original conclusion was that the two alternative gravity models and MICE matched the observations reasonably well. However, the most exciting part was yet to come: because we had access to over 259,000 galaxies, we could divide them into several types – relatively young, blue spiral galaxies versus relatively old, red elliptical galaxies.” Those two types of galaxies come about in very different ways: red elliptical galaxies form when different galaxies interact, for example when two blue spiral galaxies pass by each other closely, or even collide. As a result, the expectation within the particle theory of dark matter is that the ratio between regular and dark matter in the different types of galaxies can vary. Models such as Verlinde’s theory and MOND, on the other hand, do not make use of dark matter particles, and therefore predict a fixed ratio between the expected and measured gravity in the two types of galaxies – that is, independent of their type. Brouwer: “We discovered that the RARs for the two types of galaxies differed significantly. That would be a strong hint towards the existence of dark matter as a particle.”

A plot showing the Radial Acceleration Relation (RAR). The background is an image of the elliptical galaxy M87, showing the distance to the centre of the galaxy. The plot shows how the measurements range from high gravitational acceleration in the centre of the galaxy, to low gravitational acceleration in the far outer regions. Image credit: Chris Mihos (Case Western Reserve University) / ESO.

However, there is a caveat: gas. Many galaxies are probably surrounded by a diffuse cloud of hot gas, which is very difficult to observe. If it were the case that there is hardly any gas around young blue spiral galaxies, but that old red elliptical galaxies live in a large cloud of gas – of roughly the same mass as the stars themselves – then that could explain the difference in the RAR between the two types. To reach a final judgement on the measured difference, one would therefore also need to measure the amounts of diffuse gas – and this is exactly what is not possible using the KiDS telescopes. Other measurements have been done for a small group of around one hundred galaxies, and these measurements indeed found more gas around elliptical galaxies, but it is still unclear how representative those measurements are for the 259,000 galaxies that were studied in the current research.

Dark matter for the win?

If it turns out that extra gas cannot explain the difference between the two types of galaxies, then the results of the measurements are easier to understand in terms of dark matter particles than in terms of alternative models of gravity. But even then, the matter is not settled yet. While the measured differences are hard to explain using MOND, Erik Verlinde still sees a way out for his own model. Verlinde: “My current model only applies to static, isolated, spherical galaxies, so it cannot be expected to distinguish the different types of galaxies. I view these results as a challenge and inspiration to develop an asymmetric, dynamical version of my theory, in which galaxies with a different shape and history can have a different amount of ‘apparent dark matter’.”

Therefore, even after the new measurements, the dispute between dark matter and alternative gravity theories is not settled yet. Still, the new results are a major step forward: if the measured difference in gravity between the two types of galaxies is correct, then the ultimate model, whichever one that is, will have to be precise enough to explain this difference. This means in particular that many existing models can be discarded, which considerably thins out the landscape of possible explanations. On top of that, the new research shows that systematic measurements of the hot gas around galaxies are necessary. Edwin Valentijn formulates it as follows: “As observational astronomers, we have reached the point where we are able to measure the extra gravity around galaxies more precisely than we can measure the amount of visible matter. The counterintuitive conclusion is that we must first measure the presence of ordinary matter in the form of hot gas around galaxies, before future telescopes such as Euclid can finally solve the mystery of dark matter.”

Source: University of Amsterdam




To find out how galaxies grow, we’re zooming in on the night sky and capturing cosmic explosions

These images are transferred across the Pacific to be processed on Swinburne’s OzStar supercomputer — which is more powerful than 10,000 personal laptops and can handle thousands of different jobs at once.

Once uploaded, the images are broken down into smaller chunks. This is when we start to see details.

Pictured are some of the galaxies visible within smaller cutouts of data sent to the DWF program from the Blanco 4m. Image credit: Sara Webb

But the galaxies above, spectacular as they are, still aren’t what we’re looking for. We want to capture new “sources” resulting from dying stars and cosmic explosions, which we can identify by having our computers search for light in places it wasn’t previously detected.

A source could be many different things, including a flaring star, a dying star or an asteroid. To find out, we have to collect continuous information about its brightness and the different wavelengths of light it emits, such as radio, X-ray and gamma-ray.

To the left is an old image of a patch of sky, and to the right is an updated image in which a new source has just appeared. This one is likely a flare star or an asteroid. Image credit: Sara Webb

Once we spot a source, we monitor changes in its brightness over the coming hours and days. If we think it may represent a rare cosmic explosion, we trigger other telescopes to collect additional data.
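Conceptually, the detection step is image differencing: subtract a reference image of the same patch of sky from the new image and flag pixels that have brightened far beyond the noise. A toy sketch of the idea (real pipelines also align and PSF-match the images first):

```python
import numpy as np

rng = np.random.default_rng(42)

# Old image of a sky patch: flat background plus noise (toy numbers).
reference = rng.normal(100.0, 3.0, size=(64, 64))
# New image of the same patch, with a bright new source injected.
new_image = reference + rng.normal(0.0, 3.0, size=(64, 64))
new_image[30, 40] += 60.0

difference = new_image - reference
sigma = np.std(difference)
candidates = np.argwhere(difference > 5 * sigma)  # 5-sigma threshold
print("candidate new sources at pixel coordinates:", candidates)
```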

Peering into the distant past

Galaxies are vast collections of stars, gas, dust and dark matter. They vary in shape, size and colour, but the two main types we see in the universe today are blue spirals and red ellipticals. But how do they form? And why are there different types?

Astronomers know the shapes and colours of a galaxy are linked to its evolution, but they’re still trying to figure out exactly which shapes and colours are linked to specific growth pathways.

We think galaxies grow in size and mass through two main channels. They produce stars when their vast hydrogen clouds collapse under gravity. As more gas is transformed into stars, they grow in size.

Thanks to space-based technology such as the Hubble Space Telescope and powerful on-ground telescopes, astronomers can now peer back in time to study galaxy growth over the history of the universe.

This is possible since the further away a galaxy is, the longer its light travelled to reach us. Because the speed of light is constant, we can determine when the light was emitted — as long as we know the galaxy’s distance from Earth (called its “redshift”).
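Converting a redshift into a “how long ago” figure requires assuming a cosmology. A minimal sketch using astropy’s built-in Planck 2018 parameters (the choice of cosmology here is mine, not the article’s):

```python
from astropy.cosmology import Planck18

# Lookback time: how long ago the light we now receive from a galaxy at
# redshift z was emitted, under the Planck 2018 cosmology.
for z in (0.1, 1.0, 2.0, 5.8):
    t = Planck18.lookback_time(z).to("Gyr").value
    print(f"z = {z}: light emitted about {t:.2f} billion years ago")

# z = 5.8 corresponds to roughly a billion years after the Big Bang,
# the era of the earliest galaxies discussed here.
```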

I measured this growth as part of my PhD, by taking images of galaxies that exist at different redshifts from as far back as when the universe was only one billion years old, and comparing their sizes.

A selection of distant galaxies spotted in my study of galaxy growth over time. These appear very different to nearby galaxies. Image credit: Rebecca Allen

When galaxies merge

Looking around the universe today, we mostly see galaxies clustered together. Astronomers believe the nature of a galaxy’s surroundings or its environment can affect its growth pathways, similar to how people in large cities can access more resources than those in rural areas.

When many galaxies are grouped together they may interact. And this interaction can stimulate bursts of star formation within a particular galaxy.

That said, this growth spurt may be short-lived, as gas and stars can be stripped away through the gravitational interaction between multiple galaxies, thereby limiting future star formation and growth in a single galaxy.

This image was captured using the Hubble Space Telescope. It shows a group of spiral galaxies, which astronomers can clearly determine due to the high resolution of the image. Image credit: Rebecca Allen

But even if a galaxy can’t form stars, it can still grow by merging with or consuming smaller galaxies. For example, the Milky Way will one day consume the smaller Magellanic Clouds, which are dwarf galaxies. It will also merge with the slightly larger Andromeda galaxy one day, to form one giant galaxy.

Yet, while many studies have been conducted to unpack galaxy evolution, we still can’t say all our questions have been answered.

It took billions of years for the galaxy clusters we observe today to form. But if astronomers can leverage the latest technologies and peer further into the distance than ever before, we will hopefully gain clues about how a galaxy’s environment can impact its growth.

Pictured are two groups of distant galaxies that existed when the universe was one-quarter of its current age. These galaxy groups will eventually come together and form a structure similar to the Virgo cluster. I have studied them both to learn more about how the galaxies within them are growing. Image credit: Rebecca Allen

The bending of spacetime reveals secrets

With decades of observations and millions of galaxies captured in surveys, experts have many theories regarding how galaxies form, and how the universe evolves. This field is called cosmology.

Thanks to Albert Einstein, we know the gravitational force of massive objects in space causes space to bend. This has been observed through a phenomenon known as “lensing”, where vast amounts of matter are concentrated in one area within objects such as black holes, galaxies or galaxy clusters.

Their gravity distorts spacetime, acting as a giant lens to reveal warped images of more distant objects behind them. Using lensing, astronomers have developed ways to find and study distant galaxies that would otherwise be hidden from view.
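The strength of such a lens is often quantified by its Einstein radius, the angular scale of the distorted images. A back-of-the-envelope estimate using the standard point-mass formula, with illustrative galaxy-scale numbers (not from any of the studies mentioned here):

```python
import numpy as np

# Einstein radius of a point-mass lens:
#   theta_E = sqrt( (4*G*M/c^2) * D_ls / (D_l * D_s) )
# Standard textbook formula; masses and distances below are illustrative.
G, c = 6.674e-11, 2.998e8          # SI units
M_SUN, MPC = 1.989e30, 3.086e22    # kg, metres

M = 1e12 * M_SUN                   # galaxy-scale lens mass
D_l, D_s = 1000 * MPC, 2000 * MPC  # distances to lens and to source
D_ls = D_s - D_l                   # flat-space shortcut, good enough here

theta_E = np.sqrt((4 * G * M / c**2) * D_ls / (D_l * D_s))  # radians
print(f"Einstein radius ~ {np.degrees(theta_E) * 3600:.1f} arcseconds")
# ~2 arcseconds: small, but resolvable by Hubble and large ground telescopes.
```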

A set of galaxy-galaxy lenses. The massive foreground galaxy’s gravity distorts spacetime, acting as a lens that reveals a warped image of a distant background galaxy. Image credit: Rebecca Allen

These observations continue to drive our understanding of galaxy evolution. They’re challenging our theories of when and how galaxies form and grow.

One 2018 discovery made by a group of researchers, including myself, revealed a set of massive and already evolved galaxies from when the universe was only about one-sixth of its current age. They would have had to form and grow extremely rapidly to fit our current models of galaxy growth.

One of the massive quiescent galaxies which our team will investigate. While extremely large, its older stars and distance make it appear as a tiny red nugget among the much brighter and closer galaxies. Image credit: Rebecca Allen, Author provided

In an upcoming investigation led by Swinburne Professor Karl Glazebrook, my team and I will become some of the first astronomers granted access to NASA’s James Webb Space Telescope to study these early galaxies.




Engineering nanobodies as lifesavers when SARS-CoV-2 variants attack

Scientists are pursuing a new strategy in the protracted fight against the SARS-CoV-2 virus by engineering nanobodies that can neutralize virus variants in two different ways.

In lab studies, researchers identified two groups of molecules that were effective against virus variants. Using different mechanisms, nanobodies in each group bypassed mutations and disabled the virus’s ability to bind to the receptor that lets it enter host cells.

Though vaccination is enabling the resumption of some pre-pandemic activities in parts of the world, SARS-CoV-2 is rapidly working its way around vaccines by mutating. In this study, the nanobodies neutralized three emerging variants: Alpha, Beta and Gamma.

“Companies have already started introducing the variants of concern into the construct of booster shots of the existing vaccines,” said Kai Xu, assistant professor of veterinary biosciences at The Ohio State University and a co-lead author of the research. “But the virus is constantly mutating, and the speed of mutation may be faster than we can capture. Therefore, we need to utilize multiple mechanisms to control the virus spread.”

An accelerated article preview of the study is published online in Nature.

Nanobodies are antibodies derived from the immunization of camelid mammals – such as camels, llamas and alpacas – that can be re-designed into tiny molecules that mimic human antibody structures and functions.

For this work, the researchers immunized llamas to produce single-chain antibodies against SARS-CoV-2. They also immunized “nanomice,” transgenic mice with a camelid gene that had been engineered by research fellow Jianliang Xu in the lab of Rafael Casellas, senior investigator at the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), to generate nanobodies similar to those produced by camelids.

The team enhanced the nanobodies’ power by immunizing the animals first with the receptor binding domain (RBD), a part of the viral surface spike protein, and following with booster shots containing the entire spike protein.

“By using this sequential immunization strategy, we generated nanobodies that can capture the virion by recognizing the receptor binding domain with very high affinity,” Xu said.

The scientists tested different nanobodies’ neutralization capacity by mapping the surface of the RBD, conducting functional and structural analyses, and measuring the strength of their binding affinity, narrowing the candidate molecules from a large library down to six.

The coronavirus is highly infectious because it binds very tightly to the ACE2 receptor to gain access to lung and nasal cavity cells in humans, where it makes copies of itself to infect other cells. The receptor-binding domain on the spike protein is fundamental to its success in attaching to ACE2.

“That RBD-ACE2 interface is on the top of the receptor-binding domain – that region is the primary target for the protective human antibodies, generated by vaccination or previous infection, to block the viral entry,” Xu said. “But it is also a region frequently mutated in the variants.”

The way mutants have emerged so far suggests that long-term reliance on current vaccines will eventually be compromised, the researchers say, because antibody effectiveness is significantly reduced by mutations at the interface.

“We found that certain nanobodies can recognize a conserved region of the receptor-binding domain, a hidden location that is too narrow for human antibodies to reach,” Xu said. And attaching at this location, even though it is some distance away from where RBD connects to ACE2, still accomplishes what is intended – blocking SARS-CoV-2 from entering a host cell.

The other group of nanobodies, which target the RBD-ACE2 interface, could not neutralize certain variants in their original form. However, when the researchers engineered this group as homotrimers – three copies linked in tandem – the nanobodies achieved potent neutralization of the virus. Altering the structure of the nanobodies that attach to the conserved region of the RBD in the same way enhanced their effectiveness as well.

There is much more research ahead, but the findings suggest nanobodies could be promising tools to prevent COVID-19 mortality when vaccines are compromised, Xu said.

“Our future plan is to further isolate antibodies specifically against emerging variants for therapeutic development, and to find a better solution for vaccines by learning from those antibodies,” he said.

Source: Ohio State University



