Tag Archives: internet

Researchers develop underwater WiFi

The technology could enable divers to send information to the surface reliably and quickly. It’s quite cheap, too.

Give me a Raspberry Pi and I’ll build anything. Credit: KAUST; Xavier Pita

The internet has become an indispensable tool, to the point where some consider access to it a basic human right. But while the internet has penetrated some of the farthest corners of the world, there’s still one place it hasn’t reached: underwater.

If you’re a diver, marine researcher, or explorer and want to send information from beneath the waves to the surface, you have three options: radio, acoustic, or visible light signals. However, all of these come with significant drawbacks: radio can only carry data over short distances, acoustic transmission is very slow, and visible light requires a clear path between the transmitter and receiver. If you wanted the best of all worlds, there was no possible option — until now.

A team of researchers has developed a system for transmitting WiFi underwater using lasers and LEDs.

“People from both academia and industry want to monitor and explore underwater environments in detail,” explains the first author, Basem Shihada.

The system, called Aqua-Fi, does not require any additional underwater infrastructure as it can operate using self-contained batteries. It also uses standard communication protocols, which means that it can communicate with other systems with relative ease.

Aqua-Fi uses radio waves to send data from a diver’s smartphone to a “gateway” device — a Raspberry Pi, the classic single-board computer used in engineering projects all around the world. The gateway then sends the data via a light beam to a computer at the surface.
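In networking terms, the gateway is simply a relay: bytes come in over one medium (radio) and go back out over another (light). Below is a minimal Python sketch of that relay loop; the OpticalTx class is a hypothetical stand-in for a laser/LED driver, since the actual Aqua-Fi firmware is not public.

```python
# Hypothetical sketch of a relay gateway in the spirit of Aqua-Fi's Raspberry
# Pi: accept data over a radio/WiFi link (modeled here as TCP) and re-transmit
# it over an optical uplink. OpticalTx is an assumed stand-in, not a real driver.
import socket

class OpticalTx:
    """Stand-in for a laser/LED transmitter driver."""
    def send(self, payload: bytes) -> None:
        print(f"[optical uplink] {len(payload)} bytes -> surface")

def relay(host: str = "0.0.0.0", port: int = 9000) -> None:
    tx = OpticalTx()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()           # the diver's phone connects over radio
        with conn:
            while chunk := conn.recv(4096):
                tx.send(chunk)           # each chunk goes out as light pulses

if __name__ == "__main__":
    relay()
```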

The researchers tested the system by simultaneously uploading and downloading multimedia between computers a few meters apart. The maximum speed they achieved was 2.11 megabytes per second, with an average round-trip delay of only 1 millisecond.

Better still, the whole system is cheap and relatively easy to set up.

“We have created a relatively cheap and flexible way to connect underwater environments to the global internet,” says Shihada. “We hope that one day, Aqua-Fi will be as widely used underwater as WiFi is above water.”

“This is the first time anyone has used the internet underwater completely wirelessly,” says Shihada.

However, this is more a proof of concept than a finished product. The system was built from basic electronic components, and the researchers want to improve its performance using faster ones. They also need to ensure that the light beam remains perfectly aligned with the receiver in moving water.

So it will still be a while before Aqua-Fi becomes publicly available, but it’s getting there, the team concludes.

Journal Reference: Basem Shihada et al. Aqua-Fi: Delivering Internet Underwater Using Wireless Optical Networks, IEEE Communications Magazine (2020). DOI: 10.1109/MCOM.001.2000009

Virtual private networks grow popular under the lockdown

The coronavirus outbreak forced us all into our homes — and thus dramatically increased the value of our internet connections.

An internet router.
Image in the Public Domain.

With our online activity rising to such prominence in our lives, many people are looking to protect both their privacy and their data. Virtual private networks seem to be one of the more popular choices for doing so. Demand for these ‘VPNs’ in April was between 22% and 36% higher than pre-pandemic levels, companies report. Demand peaked in late March, with an increase of 65% above previous levels.

Private browsing

“Online searches for VPN began to surge around the world in mid-March in the days following the World Health Organization’s declaration of a pandemic on March 11,” writes Simon Migliano, a Digital Privacy & Censorship expert, for CNET. “We’ve seen demand suddenly double in countries where lockdowns have been announced or expected.”

VPNs work by extending a private network, such as one you’d set up with a few cables between two computers in your home, over the internet. Think of it as a ‘residents only’ lane of the internet. They’re called ‘virtual’ because they aren’t physically separate networks; they use the (public) architecture of the Internet as we know it to create networks that function like private ones.

Machines that are part of this network enjoy a greater degree of privacy and security of data. Robust encryption is one of the key features of popular VPNs like Atlas VPN.
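The core idea is straightforward: traffic is encrypted before it leaves your machine, so anyone in between, including your ISP, sees only ciphertext. Here is a toy Python illustration of that encrypt-then-forward concept using the cryptography package; real VPN protocols such as WireGuard or OpenVPN are far more involved, so treat this as a sketch of the principle only.

```python
# Toy illustration of a VPN tunnel's core idea: encrypt traffic before it
# crosses the public internet. This is NOT how any particular VPN product is
# implemented; it only demonstrates the principle of an encrypted tunnel.
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()    # shared between your device and the VPN server
tunnel = Fernet(key)

request = b"GET /private-page HTTP/1.1\r\nHost: example.com\r\n\r\n"
sealed = tunnel.encrypt(request)    # what an eavesdropper sees: opaque bytes
print(sealed[:40], b"...")

# Only the VPN endpoint holding the key can recover the original request.
assert tunnel.decrypt(sealed) == request
```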

Other popular uses of VPNs are as smokescreens to protect a user’s identity and real-life location (through the use of a proxy server), usually in order to bypass georestrictions, or as a means of avoiding censorship or surveillance.

All in all, powerful tools — but not something that the vast majority of people would have given much thought to a few months back.

They have gained major interest during the lockdown, however. Both business and private user demand have increased in 75 countries, with Egypt showing the greatest increase, of 224%. Other companies have reported a 36% increase in global users from February to March.

Work, education, entertainment, and social events are all increasingly moving online. Internet traffic as a whole shot up by 50% in the week of March 23, after restrictions were set in place to curb the outbreak in the EU and US. The more time we spend digitally, the more important issues such as net neutrality and access become as well. 

The Internet is dealing well with growing coronavirus traffic

With most people staying at home across the world due to the coronavirus pandemic, a big increase in internet traffic has been registered, leading to worries about the resilience of the infrastructure. These fears appear to be unfounded, according to a new report.

Credit: Flickr.

The network equipment maker Nokia analyzed usage data from its Deepfield Network Analytics platform, focusing on the week of March 23, just after stringent restrictions were imposed across Europe and the United States to try and curb the spread of the new coronavirus.

Web traffic spiked by up to 50% immediately after millions of people were directed to stay at home, sparking fears that networks would not be able to cope with the surge. But since then, “we have seen the stabilization of peak demand, with manageable traffic growth,” the report by Nokia’s Deepfield analytics arm said. Workers are turning to data-heavy video conferencing apps, which create a lot of internet traffic across all regions. Nevertheless, networks are handling the increase well.

Verizon, Cox, and AT&T have built more cell sites to strengthen mobile networks, increased the number of fiber connections on their network backbones, and upgraded the routing and switching technology that lets devices talk to one another and share an internet connection.

In Europe, Orange has doubled the capacity of its undersea internet cables. In Italy, where home internet use is up 90%, Telecom Italia said its technicians continued to make repairs and add capacity. Vodafone said it had increased its capacity 50% in recent weeks through a mix of software solutions.

Weekday online traffic surged in the mornings after the movement restrictions were imposed, but fluctuated less than before the lockdowns, report author Craig Labovitz said. “There are no afternoon lags that were the result of the afternoon commutes from workplaces to homes,” he added.

For example, the video conferencing app Zoom — which has emerged as a favorite way for many workers, families, and friends to stay in touch online — has seen its traffic increase as much as 700% on some US networks, the report said. Video streaming also saw a big hike, with Disney+ responsible for 18% of video streaming traffic. The report pointed out that Disney+ does not appear to have followed other video-on-demand services such as Netflix, YouTube, and Amazon Prime in reducing streaming quality in Europe to accommodate the increased usage without increasing bandwidth requirements.

Although infrastructure has enough capacity to deal with the coronavirus-triggered surge in demand for now, Dexter Thillien, a senior industry analyst at Fitch Solutions, told CNBC there could be trouble ahead.

The big uncertainty going forward, he says, is not knowing how long the pandemic — and the nationwide shutdowns it has caused — will last. If engineers are required to self-isolate, for instance, this may make it harder for telecommunications companies to maintain the copper and fiber cables and other equipment needed.

Internet access should be a basic human right, study says

In the digital age, not having internet access comes at a huge opportunity loss.
Credit: Pixabay.

A scholar in ethics argues that internet access should be considered a basic human right, similarly to the global right to health and liberty. His study suggests that in places where humans lack the means of getting online, other basic rights can be undermined.

“Internet access is no luxury, but instead a moral human right and everyone should have unmonitored and uncensored access to this global medium – provided free of charge for those unable to afford it,” says Dr. Merten Reglitz, a lecturer in global ethics at the University of Birmingham in the UK.

“Without such access, many people lack a meaningful way to influence and hold accountable supranational rule-makers and institutions. These individuals simply don’t have a say in the making of the rules they must obey and which shape their life chances.”

Reglitz’s research found that exercising free speech and obtaining information in these modern times is unavoidably linked to internet access. During elections, for instance, social media channels are flooded with political debates and politically-oriented content shared across people’s networks. By not having access to such information, and not being able to participate in the debate, an “offline” person has comparatively less freedom of speech.

In order to highlight the importance of the internet in shaping the modern political and socio-cultural landscape, Reglitz cites several recent examples in which online campaigns and movements have led to wide-scale change. Among them, Reglitz mentions the ‘Arab Spring’ (people would use the internet to report on government atrocities), the documentation of unjustified police violence against blacks in the United States, or the viral #MeToo campaign, which tackled top-level sexual harassment of women by men in positions of authority.

Reglitz argues that other basic human rights such as life, liberty, and freedom from torture can be much more effectively protected as long as there is internet access. In his opinion, internet access is of such fundamental importance that if a nation is unwilling or unable to provide its citizens with this right, then this should be a call for the international community to step in.

Some nations and organizations have already taken the initiative in this respect. Kerala, a state in India that is home to 35 million people, has declared universal internet access a human right and is aiming to uphold this value by the end of 2019. Meanwhile, the EU plans to provide every European city and village with free WiFi around the main centers of public life by 2020. Global internet access is also part of the United Nations’ Sustainable Development Goals.

It’s believed that 51% of the world’s population of 7 billion people has access to the internet. There has been tremendous growth during the past decade thanks to the proliferation of cheap smartphones and advances in telecom infrastructure. However, there is still much to do in order to help the other half of the world’s population to catch up.

The challenge lies in financing such an undertaking — but this is definitely doable. After all, it’s not like everyone needs the fastest internet and latest devices. A steady, low-bandwidth connection on a low-cost smartphone is more than enough for basic internet access and communication.

“Universal internet access need not cost the earth – accessing politically important opportunities such as blogging, obtaining information, joining virtual groups, or sending and receiving emails does not require the latest information technology,” commented Dr. Reglitz.

“Web-capable phones allow people to access these services and public internet provision, such as public libraries, can help get people online where individual domestic access is initially too expensive.”

One huge leap toward universal internet access could be made if current initiatives to provide satellite internet are successful. SpaceX plans to launch a constellation of 12,000 high-speed internet satellites around the Earth, which could provide data transfers up to 50% faster than existing fiber-optic cables. The goal is to have this blanket of satellites operational by 2027, which would provide internet to even some of the most remote locations in the world.

Whether it’s conventional telecom or satellite-based internet, the aim should be to drastically improve broadband affordability. According to the World Wide Web Foundation, internet access is considered affordable as long as one gigabyte of data doesn’t cost more than 2% of a person’s monthly income. Today, 2.3 billion people cannot afford internet access — and this is simply not acceptable in the digital age. In many respects, a lack of internet access today carries a similar opportunity burden as not having electricity did a century ago.
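That 2% affordability threshold is simple enough to compute. A quick sketch below, with made-up income and price figures purely for illustration:

```python
# The "1 GB for no more than 2% of monthly income" affordability rule quoted
# above, as a quick check. The incomes and prices are invented examples.
def is_affordable(price_per_gb: float, monthly_income: float) -> bool:
    """Affordable if 1 GB of data costs at most 2% of monthly income."""
    return price_per_gb <= 0.02 * monthly_income

for income, price in [(3000, 10.0), (150, 10.0), (150, 2.5)]:
    verdict = "affordable" if is_affordable(price, income) else "unaffordable"
    print(f"income ${income}/mo, 1 GB at ${price}: {verdict}")
```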

Imagine the kind of advances society could make if everyone had access to the internet. Broadband connectivity carries the unprecedented potential to bridge education divides, transform learning and improve skills for the globalized economy. A student in a developing country can access the library of a prestigious university anywhere in the world, an unemployed person can retrain and improve their job prospects in other fields, and individuals can found digital startups that enhance the local economy and provide jobs.

The study was published in the Journal of Applied Philosophy.

We’re 50 km closer to quantum internet

A team of researchers based in Innsbruck reports sending light entangled with quantum information over a 50-kilometer-long stretch of optic fiber.

Image credits Joshua Kimsey.

The study comes as a collaboration between members of the Department of Experimental Physics at the University of Innsbruck and of the Institute of Quantum Optics and Quantum Information of the Austrian Academy of Sciences. They report a new distance record for the transfer of quantum entanglement between matter and light.

Such results pave the way for long-range quantum communication, which would enable transfers between different cities, for example. Effectively, these are the early stages of a quantum internet.

Lasers and crystals

“[50 km] is two orders of magnitude further than was previously possible and is a practical distance to start building inter-city quantum networks,” says Ben Lanyon, the experimental physicist at the Austrian Academy of Sciences who led the research.

One of the most appealing prospects of a quantum internet is that it should be completely tap-proof. Information in such a network is encrypted and unbreakable, and any interference with the signal is readily apparent.

However, quantum information cannot be copied or amplified, so it can’t simply be passed along by your router the way ordinary data is.

Such information needs to be carried by entangled particles. So the team took a calcium ion, secured it in an ion trap, and blasted it with lasers. This step both ‘wrote’ the information into the ion as a quantum state and made it emit a photon (made it ‘glow’, basically). Then, this photon’s wavelength needed to be converted so it could survive the journey down the optic fiber.

“The photon emitted by the calcium ion has a wavelength of 854 nanometers and is quickly absorbed by the optical fiber,” says Lanyon.

The researchers sent the photon through a crystal illuminated by a strong laser to convert it to a wavelength of 1550 nanometers, the telecom band where fiber losses are lowest. The calcium atom and light particle were still entangled even after the conversion and a 50-kilometer journey through the optic cable.
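Some back-of-the-envelope math shows why the conversion matters. Using typical textbook attenuation figures for silica fiber (assumed values, not numbers from the paper): roughly 3 dB/km near 850 nm versus about 0.2 dB/km in the 1550 nm telecom band.

```python
# Why convert 854 nm photons to 1550 nm before a 50 km fiber run. The
# attenuation figures are typical textbook values for silica fiber (an
# assumption for illustration, not data from the Innsbruck experiment).
def surviving_fraction(db_per_km: float, km: float) -> float:
    """Fraction of photons left after `km` of fiber with the given loss."""
    return 10 ** (-db_per_km * km / 10)

for wavelength, loss_db_per_km in [("854 nm", 3.0), ("1550 nm", 0.2)]:
    frac = surviving_fraction(loss_db_per_km, 50)
    print(f"{wavelength}: {frac:.2e} of photons survive 50 km")
# ~1e-15 survive at 854 nm versus ~0.1 at 1550 nm: conversion is essential.
```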

In the future, the team wants to double the distance such a particle can travel to 100 km of optic fiber, potentially enabling connections between cities. Only a handful of trapped-ion systems would be required to maintain a quantum internet link between Innsbruck and Vienna (387 km / 240 mi), for example.

The paper “Light-matter entanglement over 50 km of optical fibre” has been published in the journal npj Quantum Information.

Yes, a quantum internet is possible, new study shows

A new study concludes that a quantum internet is feasible at a global scale, previewing what tomorrow’s global communications might look like.

Image in public domain.

Few words sound better together than “quantum” and “internet” — it sounds like the future, and it very possibly is the future. A group of researchers at the University of Padova, in Italy, carried out an experiment between satellites in orbit and a station on the ground. Using existing quantum technology, they were able to pass a signal over some 20,000 kilometres (12,427 miles) of air and space without any significant interference or data loss, showcasing the feasibility of secure quantum communications on a global scale.

“Space quantum communications (QC) represent a promising way to guarantee unconditional security for satellite-to-ground and inter-satellite optical links, by using quantum information protocols as quantum key distribution (QKD),” says one of the researchers, Giuseppe Vallone from the University of Padova in Italy.

Essentially, the team carried out an exchange of photon pulses between two satellites in the Russian GLONASS constellation and the Space Geodesy Centre of the Italian Space Agency. GLONASS satellites are global positioning satellites, similar to GPS; together, such systems are known as the Global Navigation Satellite System (GNSS).

Co-lead author Professor Paolo Villoresi explains:

“Our experiment used the passive retro-reflectors mounted on the satellites. By estimating the actual losses of the channel, we can evaluate the characteristics of both a dedicated quantum payload and a receiving ground station.”

“Our results prove the feasibility of QC from GNSS in terms of achievable signal-to-noise ratio and detection rate. Our work extends the limit of long-distance free-space single-photon exchange. The longest channel length previously demonstrated was around 7,000 km, in an experiment using a Medium-Earth-Orbit (MEO) satellite that we reported in 2016.”

While the study showed that such transmissions are feasible, it’s still only a proof of concept, and it will be a long time before anything is developed practically. In a potential quantum internet, speeds will be very slow for the foreseeable future.

The lure of a potential quantum internet comes from security — essentially, quantum communications are encrypted and unbreakable. Thanks to the nature of the technology, any interference is quickly detected, making QKD communications impossible to intercept unnoticed. If a message sent through quantum technology were intercepted by a third party, it would be destroyed.
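A toy simulation makes that tamper-evidence concrete. The sketch below models intercept-and-resend eavesdropping on a BB84-style QKD exchange (a textbook protocol used here purely as an illustration, not a model of the Padova experiment): the eavesdropper unavoidably corrupts about a quarter of the bits the legitimate parties later compare, which is how they detect her.

```python
# Toy model of tamper-evidence in BB84-style QKD: an eavesdropper who measures
# each photon in a randomly guessed basis and resends it corrupts ~25% of the
# sifted bits. Textbook illustration only, not the Padova experiment's setup.
import random

def qber(n: int = 200_000, eavesdrop: bool = True) -> float:
    errors = kept = 0
    for _ in range(n):
        bit, basis = random.getrandbits(1), random.getrandbits(1)   # sender
        if eavesdrop and random.getrandbits(1) != basis:
            # Eve guessed the wrong basis: the photon she resends carries a
            # randomized bit relative to the sender's original encoding
            arrived = random.getrandbits(1)
        else:
            arrived = bit
        if random.getrandbits(1) == basis:   # receiver guessed sender's basis
            kept += 1                        # only these rounds are kept
            errors += (arrived != bit)
    return errors / kept

print(f"error rate with eavesdropper:    {qber(eavesdrop=True):.3f}")   # ~0.25
print(f"error rate without eavesdropper: {qber(eavesdrop=False):.3f}")  # 0.000
```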

So while this likely won’t make our internet faster or cooler anytime soon, it could revolutionize a number of communications, making them more reliable and safer.

“Satellite-based technologies enable a wide range of civil, scientific and military applications like communications, navigation and timing, remote sensing, meteorology, reconnaissance, search and rescue, space exploration and astronomy,” Vallone concludes.

The study “Towards quantum communication from global navigation satellite system”, by Calderaro et al, has been published in the journal Quantum Science and Technology.

The World Wide Web’s inventor says we need a ‘new contract’ for the world wide web

Did the world wide web take a dark turn? Its inventor says yes.

Laptop.

Image via Pixabay.

This Monday, world wide web inventor Tim Berners-Lee announced that a new “contract” is in the works, aiming to ensure that the platform remains “safe and accessible” for all on the internet.

You were the promised one!

“All kinds of things have gone wrong,” said Berners-Lee at the opening of the Web Summit, Europe’s largest tech event.

“We have fake news, we have problems with privacy, we have people being profiled and manipulated,” he said.

Back in 1989, when the web was first developed, Berners-Lee saw it as a pioneer for new horizons. He saw it as “an open platform that would allow everyone, everywhere to share information, access opportunities, and collaborate across geographic and cultural boundaries,” according to an open letter he penned in 2017.

However, that vision has withered on the vine. Berners-Lee confessed he’s “increasingly worried” about how the Web is evolving. The lack of privacy, the spread of misinformation, and lack of transparency in online political advertising are some of the biggest rotten apples. All in all, he feels that the Internet has to be saved from itself — so Berners-Lee is now trying to rally companies, governments, and citizens to the cause.

His plan is to create a new “Contract for the Web.” A nod to Hobbes’ concept of the social contract, the new framework aims to ensure that the Internet is used ethically and transparently by all participants. Governments, companies, and citizens are welcome to pitch in, and Berners-Lee plans to have the new contract ready by May 2019. The date marks the earliest estimate for when 50% or more of humanity will be connected to the Internet.

Sir Tim Berners-Lee.

“All kinds of things have gone wrong” with the internet, says Berners-Lee.

“We’ve lost control of our personal data and that data is being weaponised against us. The power to access news and information from around the globe is being manipulated by malicious actors,” his Web Foundation said in a report outlining the need for a new contract for the web.

“Online harassment is rampant, and governments are increasingly censoring information online—or shutting down the internet altogether.”

Employees of Google, Facebook, and other tech giants have also voiced similar concerns publicly over the past few months. Overall, they say the products they’ve worked on have grown to become addictive and harmful to society.

The French government, Google, and Facebook have said that they back the proposal, the Web Foundation reports — especially the right to privacy and the guarantee that anyone should be able to connect to the Web. The two firms collectively control over three-quarters of all internet traffic through their apps, such as YouTube, WhatsApp, and Instagram.

Freeing constraints

“We have big and small players, it’s not the UN of the digital world, it’s a call for voluntary engagement, for those who want to be part of the solution, whether they’re part of the problem or not,” the foundation’s policy director, Nnenna Nwakanma, told AFP.

Recent research has found that over 2 billion people live in places where internet is prohibitively expensive to access. The Web Foundation says that most people who aren’t yet online live in poor countries. It also criticized the fact that “billions” of people have to connect to the Web through “a small handful of huge companies.”

The contract (in its current, work-in-progress version) includes some short principles aimed at three sectors: government, private companies, and citizens. They’re meant to ensure a free and open web for all, but some concerns have been raised about their currently vague nature. For example, one principle holds that companies “respect consumers’ privacy and personal data,” which is a very noble goal that I wholeheartedly support — but one that’s extremely hard to quantify and thus extremely tricky from a legal point of view.

Still, there is hope in sight. The initiative has been joined by over 50 high-profile partners — including Sir Richard Branson, the founder of the Virgin Group, alongside those listed earlier — who have signed the contract. With such support, the Contract may just turn out fine.

So what is your take on the issue? I grew up on the wild web of yore, and I will die on the hill of a free, neutral internet if need be. It also helps that I was raised in Eastern Europe, so I have a deeply-ingrained aversion to limits on freedom.

But at the same time, Berners-Lee’s warnings do have weight. Depending on where you live, you’ve likely seen your government make some dubious, if not outright authoritarian, choices in regard to how the Internet works. A completely free internet also means companies like Facebook can keep selling our data to the likes of Cambridge Analytica, and that Russian trolls can keep posting weak memes on obscure groups (that somehow still sway elections and patterns of vaccination) with impunity.

One serves as a reminder that the freedom we enjoy on the Internet today will always be in the crosshairs of those who seek power and profit. The other reminds us that it can be hijacked by toxic actors looking to get away with (figurative) murder. It’s a delicate balancing act, but limiting some freedoms could end up creating a fairer, more open Web.

How would you like to see Internet 2.0?

Scientists link broadband internet access to sleep deprivation

Credit: Pixabay.

People with digital subscriber line (DSL) access tend to sleep 25 minutes less than those without DSL. They are also less likely to get enough sleep (7 to 9 hours) or to feel satisfied with the quality of their sleep. According to the researchers who produced the findings, the effect may be explained by time constraints in the morning and by the use of electronic devices in the evening — but not by their use throughout the day.

Researchers at Bocconi University, Italy, and the University of Pittsburgh, USA, cross-referenced data on broadband access in Germany with surveys in which individuals reported their sleep duration and quality. Previously, studies have analyzed the effects of broadband access on electoral outcomes, social capital, fertility, and sex crimes. However, this is the first time that scientists have examined the causal effect of access to high-speed Internet on sleeping behavior.

Poor sleep is a major public health hazard, which some scholars deem the most prevalent risky behavior in modern society. In developed countries, this is an increasing problem as more and more people forgo the recommended 7-9 hours of sleep, exposing themselves to detrimental outcomes for health and cognitive performance. In Germany alone, 200,000 working days are lost each year due to insufficient sleep, translating into an economic loss of $60 billion, or about 1.6% of the country’s GDP, according to a report by the RAND Corporation.

The researchers chose to focus on German citizens because the German Socioeconomic Panel (SOEP) is one of the few panel surveys containing a rich set of information both on sleep and access to high-speed Internet. Moreover, the country’s telecom infrastructure has many peculiarities that allowed the researchers to build stronger links between internet usage and sleep quality. For instance, you can see in the graph below how the territory corresponding to former East Germany (German Democratic Republic) — which is still feeling the effects of Soviet rule 30 years after the Iron Curtain fell — used to lag behind West Germany in terms of broadband access.

Today, 75% of Germany has 50-megabit internet, but the federal government plans to invest 100 billion euros (US$106 billion) over eight years to roll out gigabit internet across the country. Elsewhere, in Australia, companies like iSelect are planning to connect 93 percent of Australians to high-speed internet by 2021.

The figure illustrates the share of SOEP households with access to DSL for the survey years 2008, 2010 and 2012 across German counties. Dark-blue areas correspond to higher levels of DSL access in the corresponding county. Credit: F.C. Billari et al. / Journal of Economic Behavior and Organization.

The research team’s conclusion was that access to high-speed Internet reduces both sleep duration and sleep satisfaction in individuals that face time constraints in the morning for work or family reasons.

“Individuals with DSL access tend to sleep 25 minutes less than their counterparts without DSL Internet. They are significantly less likely to sleep between 7 and 9 hours, the amount recommended by the scientific community, and are less likely to be satisfied with their sleep,” said Francesco Billari, a Full Professor of Demography at Bocconi University, Milan.

Average sleep hours (Weekdays) by DSL Access. Credit: F.C. Billari et al. / Journal of Economic Behavior and Organization.

The increased use of electronic devices in the bedroom before sleep is considered one of the main factors contributing to the sleep deprivation epidemic and access to high-speed internet promotes excessive electronic media use.

“Taken together, our findings suggest that there may be substantial detrimental effects of broadband internet on sleep duration and quality through its effects on technology use near bedtime. High-speed Internet makes it very enticing to stay up later to play video games, surf the web and spend time online on social media. Given the growing awareness of the importance of sleep quantity and quality for our health and productivity, providing more information on the risks associated with technology use in the evening may promote healthier sleep and have non-negligible effects on individual welfare and well-being. More research is needed to understand the behavioral mechanisms underlying Internet addiction and how to nudge individuals into healthier sleep practices,” the researchers concluded in the Journal of Economic Behavior & Organization.

SpaceX will launch three times more satellites than there currently are in orbit to give you fast internet

The ultimate goal of SpaceX has always been to reach Mars, but in the meantime, the visionary private space company has its eyes set on other targets. Yesterday, we learned more about SpaceX’s plans to launch a huge fleet of tiny satellites into Earth’s low orbit to connect every part of the globe with fast internet. As many as 4,425 satellites would be launched as part of the initiative, nearly three times as many as there are active satellites currently in orbit.

Space internet

According to Patricia Cooper, SpaceX’s vice president of satellite government affairs, the first prototype satellite will go into space as early as this year, followed quickly by another launch in early 2018. These first attempts will serve as a proof of concept so that SpaceX can assess whether or not it’s feasible to send a couple thousand more. In 2019, SpaceX expects to seriously start deploying satellites in low orbit. By 2024, SpaceX ought to have an entire network of satellites capable of delivering internet to any part of the globe, and presumably cheaply, too.

The thousands of tiny satellites will operate in 83 orbital planes at altitudes between 1,100 and 1,325 kilometers. They will be linked with ground control centers, gateway stations, and other Earth-based facilities to offer a steady connection somewhere between cable and fiber-optic in quality, with a latency of around 35 ms.
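That latency figure passes a quick sanity check: the speed-of-light round trip to an altitude of 1,100 km takes only a few milliseconds, leaving most of the 35 ms budget for routing and processing, whereas a geostationary satellite is so far away that physics alone rules out low latency. A rough calculation, assuming straight up-and-down paths in a vacuum:

```python
# Back-of-the-envelope latency check for satellite internet. Assumes a
# straight vertical path and vacuum light speed; real paths are longer.
C_KM_PER_MS = 299_792.458 / 1000   # speed of light in km per millisecond

def light_round_trip_ms(altitude_km: float) -> float:
    """Minimum up-and-down light travel time to a satellite, in ms."""
    return 2 * altitude_km / C_KM_PER_MS

print(f"LEO, 1,100 km:  {light_round_trip_ms(1_100):.1f} ms")    # ~7.3 ms
print(f"GEO, 35,786 km: {light_round_trip_ms(35_786):.1f} ms")   # ~238.7 ms
```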

When it’s all said and done, the project should amount to $20 billion in investment, with Google and Fidelity already investing $1 billion with SpaceX for this particular purpose.

Whatever the case, we’re talking about a massive scale with a high potential for disruption in the industry. Providers won’t have to invest nearly as much in digging trenches, laying hundreds of thousands of miles of cables, and so on. Instead, they’ll just rent the service from SpaceX, or at least that’s what Elon Musk’s company likely hopes.

“Once fully deployed, the SpaceX System will pass over virtually all parts of the Earth’s surface and therefore, in principle, have the ability to provide ubiquitous global service,” reads the FCC application SpaceX filed a while ago. “Every point on the Earth’s surface will see, at all times, a SpaceX satellite.”

An additional 7,500 satellites might be deployed at an even lower altitude than the main fleet to further boost capacity and reduce latency. By now, some of you might be wondering how on Earth SpaceX will pull this off. Well, earlier this year, India broke the record for the most satellites launched on a single rocket, with 104 satellites. Most were nanosats weighing only 6 kilograms (13 pounds), so it’s foreseeable that it won’t be long before SpaceX shatters this record.

SpaceX is also growing better and better at launching, landing, and then reusing the same Falcon 9 rockets. This puts it at a considerable advantage over its competitors, as it can cut costs severalfold, at least. At the same time, the huge number of satellites SpaceX is considering can only exacerbate a growing space junk problem, which we reported on earlier in more depth.

The announcement is still fresh, but seeing how the first prototypes will be deployed later this year, things are destined to move very fast. It will be interesting to learn more as developments arise. It would be particularly interesting to see how Facebook reacts to the news, for instance. Not too long ago, a Falcon 9 rocket was destroyed shortly after liftoff, and with it Facebook’s $200-million satellite destined to connect users in Africa to the internet. Facebook’s initiative, internet.org, wants to bring a free internet connection to the people who need it most, particularly in developing countries around the world. Internet.org has come under fire, though, because it’s not really offering free internet — it’s free access to a limited variety of web services, with Facebook and partners at the center. As Mahesh Murthy wrote for QZ, it’s “poor internet for poor people”.

I have a feeling SpaceX’s vision of fast internet delivered over satellite will be radically different. Perhaps it will be even more business-oriented than Facebook’s approach, but certainly net neutral. We’ll just have to wait and see.

Yesterday, US officials said you had no right to online privacy — we don’t agree so here’s Internet Noise to help you out

In the wake of yesterday’s decision by the House of Representatives to allow internet service providers to sell browsing data, one programmer is determined to make that data as worthless as possible — and he’s willing to share his work.

If you’re anything like me, when the House of Representatives decided yesterday that ISPs can sell your browsing data to basically anyone, you were positively furious. The word bull and something closely resembling the word “ship” rolled around in my head like a marble in a cup. I go to the Internet partly to work, partly to disconnect from the real world. And I like my privacy for both of those things.

Privacy Policy Keyboard.

“You can’t have it.” — bunch of US officials.

Harsh. I still want my privacy. I wanna browse pictures of cats in peace, then share a laugh over them without someone uninvited seeing any line of chat. I want to read NASA’s latest tidbits without the NSA (subtracting an A makes a huge difference) peering over my shoulder.

It’s my experience. It’s my little corner of the immaterial. I don’t want anyone to burst in on it. If I wanted to be under constant surveillance I’d fly to London. But I don’t, so I just Google-Map London and use the tiny yellow guy to see the sights.

*sigh*

That’s not how it works, though, and I know that. ISPs keep track of everything you do because they’re what actually connects you to the disjointed bits and servers that create the seamless Internet we know and love. For the most part, they had to keep this data to themselves, so we had some modicum of privacy. That’s about to change for those of you living in the US, congrats, since that data is now up for grabs by anyone who can pay for it — and make no mistake, people will pay for it, profile you with it, and then try to sell you stuff according to that profile. Because capitalism.

I’m not a fan of that. Somehow it manages to have this 1984-meets-Brave New World vibe and I don’t want any of it, no siree. Paint me a barbarian, but I’d rather not get a 10% discount on something I may actually want if it means a server somewhere is crunching my 3 AM alcohol-fueled research of exotic cuisine on Wikipedia like so many 1’s and 0’s.

Luckily, there’s one brave soul out there who feels the same way I do but also has the skills to do something about it. His name is Dan Schultz, and he has built the next best thing to Internet invisibility. Dan heard about the vote on Twitter somewhere around 1 AM, turned off Zelda, and coded Internet Noise — a tool which will shotgun searches from your browser left and right, all in the name of burying your real searches in a deluge of random ‘noise’.

“I cannot function in civil society in 2017 without an internet connection, and I have to go through an ISP to do that,” he says.

Hiding in plain sight

Internet Noise acts like your run-of-the-mill browser extension, but in truth, it’s just a website that auto-opens a bunch of random Google search tabs. The idea is that if you can’t keep an ISP from profiling you, you can at least give them a false image of yourself. It’s a pretty sad thing to need such a tool, and Schultz himself hopes that Noise will help people understand the risk their online privacy is under at this point.

It’s a pretty straightforward program. Schultz simply googled “top 4,000 nouns” and made a gibberish list out of them. With a click on “Make some Noise”, Internet Noise draws on the magic powering Google’s “I’m feeling lucky” button to search for those terms or permutations of them, opening five tabs of results. Ten seconds later, you get another five, then five more, and so on. It will keep going until you hit “STOP THE NOISE!”, by which point your browsing history should look like a potpourri of random links. Schultz says the best way to use it is to start the Noise when you turn in for the night and stop it the next day.
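The mechanics fit in a few lines of code. Here is a minimal Python sketch of the same idea; the real tool is a JavaScript web page, so this is an approximation of its behavior rather than Schultz’s actual code, and the noun list is a tiny stand-in for his 4,000 terms.

```python
# A minimal sketch of the Internet Noise idea: periodically open tabs of
# random two-word Google searches. Approximation only; the real tool is a
# JavaScript web page, and its noun list is ~4,000 entries long.
import random
import time
import webbrowser
from urllib.parse import quote_plus

NOUNS = ["pelican", "thermostat", "harpsichord", "geode", "trampoline",
         "casserole", "lighthouse", "accordion", "nebula", "turnip"]

def make_noise(rounds: int = 3, tabs_per_round: int = 5) -> None:
    for _ in range(rounds):
        for _ in range(tabs_per_round):
            query = " ".join(random.sample(NOUNS, 2))  # random two-word search
            webbrowser.open_new_tab(
                "https://www.google.com/search?q=" + quote_plus(query))
        time.sleep(10)   # the original fires a new volley every ~10 seconds

if __name__ == "__main__":
    make_noise()
```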

Soon enough, you’ll start seeing some pretty random stuff popping up in your Facebook feed, for example. Stuff you won’t have the first clue about. And that’s proof the Noise is working, muddying your Internet activity profile and causing algorithms to spew out all kinds of false positives.

Privacy keyboard.

Image credits g4ll4is / Flickr.

Still, it’s not a be-all-end-all program. Anyone slightly more competent than your average advertising company could probably pick out your real searches from the noise with a decent success rate, since the fake ones are obviously random clicks with little follow-through. With 4,000 terms and 16,000,000 possible two-word combinations of them to rifle through, the tool is also really unlikely to visit the same page twice and astronomically unlikely to visit it three or more times. It’s a really random fog-maker, and its activity doesn’t look human or plausible enough to be truly good at masking your activity. A smart enough algorithm can probably pick Noise apart in a few seconds. But not all algorithms are smart.

It might even get you into some more hardcore surveillance if the program’s searches add up to something that appears sketchy. Schultz obviously hasn’t been able to vet all the terms to see if any could land you in a spot of trouble — think “pipes”, “industrial fertilizers”, and “do we really need the government” in one night. I’m exaggerating on that last one just to prove a point.

At the end of the day, though, Schultz says the main point is to raise awareness. However, the project is open source and could evolve into a more complex program. People are already contributing, fixing minor bugs, and suggesting possible improvements. But until more efficient privacy kits become available, your only real option is to learn as much as you can about what the tools you use can and can’t do, and to dodge the system as well as possible beyond what they offer.

But I do harbor hope. Internauts have never had much political traction, but they’ve never lacked for imagination, resourcefulness, and a brash commitment to stick it to the man when his ethics fall into question. The Noise might be feeble, but its offspring won’t.

If you missed it, here’s a link to Internet Noise:

Noise-me!: https://slifty.github.io/internet_noise/index.html

Superdense-coded logo of an oak leaf sets new record for transfer rate over optic cable

Department of Energy researchers working at the Oak Ridge National Laboratory have just set a new world record for data transfer speed. They relied on a technique known as superdense coding, which uses properties of elementary particles such as photons or electrons to store much more information than previously possible.

Image credits Thomas B. / Pixabay.

The Oak Ridge team has achieved a transfer rate of 1.67 bits per qubit (quantum bit) over a fiber optic cable, a small but significant improvement over the previous record of 1.63 bits per qubit.

Awesome! What does it mean though?

One of the most fundamental differences between a traditional computer and a quantum one is how they encode and transmit data. Computers do it in bits — 1s or 0s. Quantum computers do it in qubits, which can be both a 1 and a 0 at the same time, bending minds and the limits of stored information alike. The team, composed of Brian Williams, Ronald Sadlier, and Travis Humble, used a physical system similar to that seen in quantum computers, which are widely touted for the speed with which they solve complex problems.

They were the first to ever transmit superdense code over optical fiber, a major step forward if we want to use quantum communication without re-installing every cable in the world. ORNL’s oak-leaf logo was chosen as the first message ever transmitted with this technique, sent between two terminals in the lab. The exact mechanisms of this process sound more like hardcore sci-fi than actual science but hey — it’s quantum physics.

“We report the first demonstration of superdense coding over optical fiber links, taking advantage of a complete Bell-state measurement enabled by time-polarization hyperentanglement, linear optics, and common single-photon detectors,” the team writes.
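The textbook, gate-model version of superdense coding is easy to demonstrate on a simulator. The sketch below uses Qiskit purely as an illustration of the trick the name refers to (two classical bits riding on a single transmitted qubit, thanks to a pre-shared entangled pair); the ORNL experiment itself encoded information in photons using hyperentanglement, not this circuit.

```python
# Textbook superdense coding (illustration only; not the ORNL photonic setup).
# A pre-shared Bell pair lets the sender convey 2 classical bits by touching
# and transmitting just one qubit. Requires: pip install qiskit
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def superdense(bits: str) -> QuantumCircuit:
    qc = QuantumCircuit(2)
    qc.h(0)                  # prepare the shared Bell pair |00> + |11>
    qc.cx(0, 1)
    if bits[0] == "1":       # the sender manipulates only qubit 0 ...
        qc.x(0)
    if bits[1] == "1":
        qc.z(0)
    qc.cx(0, 1)              # ... then the receiver performs a Bell-state
    qc.h(0)                  # measurement to decode both bits at once
    return qc

for message in ["00", "01", "10", "11"]:
    state = Statevector.from_instruction(superdense(message))
    print(message, "->", state.probabilities_dict(decimals=3))
# Each two-bit message maps to a single deterministic measurement outcome.
```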

The team used run-of-the-mill laboratory equipment such as common fiber optic cable and standard photon detectors, meaning their technique is already suited for practical use.

Right now, the technology remains largely experimental. The potential applications are very enticing though, including a novel, cost-effective way of condensing and transferring dense packages of information at high speed. The main winner here is, of course, the Internet — the tech could allow for anything from less buffering time on Netflix to improved cybersecurity applications.

“This experiment demonstrates how quantum communication techniques can be integrated with conventional networking technology,” Williams said. “It’s part of the groundwork needed to build future quantum networks that can be used for computing and sensing applications.”

The full paper “Superdense coding over optical fiber links with complete Bell-state measurements” has been published in the journal Physical Review Letters, where it was selected as an “Editor’s Suggestion” paper.

Here’s why there was no Twitter on Friday — it’s way scarier than you think

You might have noticed something strange in your Internet adventures last Friday — the distressing absence of a large part of it. An official statement released Friday by DNS provider Dyn explains what happened, and why it might happen again.

Image credits Blondinrikard Fröberg / Flickr.

Large sections of the Internet became basically inaccessible last week, as three massive Distributed Denial of Service (DDoS) attacks hit a company called Dyn. This company provides Domain Name System (DNS) hosting for hundreds of websites, including Twitter, Reddit, Amazon, Netflix, and PayPal. A DNS host basically “places” a website on the web by translating the domain names users type, such as “ZMEScience.com”, into the IP addresses of the servers that actually hold the site. Take the host out of the equation, and the two can’t communicate — like cutting the cord between two landlines.
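You can watch that name-to-address step happen from any Python prompt. When a DNS host like Dyn goes down, it is this lookup that fails, even though the websites’ own servers keep running fine.

```python
# DNS in one line: translating a human-readable name into an IP address.
# When Dyn's DNS servers were knocked out, lookups like these failed for the
# sites it hosted, which made them unreachable despite being fully online.
import socket

for site in ["twitter.com", "reddit.com", "netflix.com"]:
    try:
        print(f"{site} -> {socket.gethostbyname(site)}")
    except socket.gaierror:
        print(f"{site} -> lookup failed (roughly what last Friday felt like)")
```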

A DDoS attack uses a large number of computers to simultaneously issue a massive number of fake visits to a server, basically flooding a website with connection requests, information requests — anything to keep the servers busy. Because the website host can’t tell which of the requests are valid and which are fake, it has to let them all through. The servers overload, buckle, and then nobody can access them anymore. Now, for the scary bit.

Welcome to the Internet of Things

DDoS attacks are among the oldest tricks in the book. As such, hosting companies like Dyn have robust systems in place to deal with them, and they regularly test those systems against mock “stresser” services, which do the same thing. Hackers looking to launch a denial of service attack have to create specific software, then infect as many computers as possible (the botnet) and run shell programs off of them — the bigger the botnet, the more powerful the flood.

For the most part, PCs have (at least) decent firewalls and antivirus programs that defend them against this type of software. So it can be hard for hackers to gain the numbers to make a dent in servers such as the ones Dyn uses. Hosting companies just have to make sure their servers can handle more traffic than hackers can realistically throw towards them, and that’s that.

Friday’s attacks, however, used a new approach: the botnet wasn’t made up of computers like the one you’re reading this article on, but other kinds of digital devices connected to the web. Think gadgets such as smart TVs, security cameras, DVRs, webcams, even web-connected thermostats and coffee makers — collectively known as the Internet of Things (IoT). It’s a ridiculously huge entity, but these devices have lousy security for the most part. When’s the last time you changed the username and password on your fridge? Exactly.

Because users don’t update these devices’ software or change their factory-set accounts and passwords, and because the underlying code is often vulnerable, these devices are easy to hack en masse. Dyn’s chief strategy officer Kyle York said the company recorded tens of millions of IP addresses in the attack, a huge botnet of IoT devices turned towards bringing down its DNS services.

We hope you’ll enjoy your stay.
Image credits Ian Kennedy / Flickr.

Krebs on Security reported that a piece of malware called Mirai was involved in the attack. The program allows pretty much anyone to create a personal botnet army, after its source code was released on the Internet last month.

“Mirai scours the web for IoT devices protected by little more than factory-default usernames and passwords, and then enlists the devices in attacks that hurl junk traffic at an online target until it can no longer accommodate legitimate visitors or users,” Krebs, a US security blogger, explained.

Since then, Chinese electronics company XiongMai has recalled its products after discovering that its surveillance cameras were used in the attack. This is a particularly disturbing problem, as many companies that sell security and web cameras buy their tech from XiongMai, put on a fresh coat of paint, and sell them under their own brand name. So yes, the webcam you’re staring into right now could very well be XiongMai tech.

“It’s remarkable that virtually an entire company’s product line has just been turned into a botnet that is now attacking the United States,” Flashpoint’s researcher Allison Nixon told Krebs. “Some people are theorising that there were multiple botnets involved here. What we can say is that we’ve seen a Mirai botnet participating in the attack.”

Dyn was ultimately able to restore hosting services on Friday, and with them, access to Twitter, Amazon, and all the other sites. But this attack could be just a preview. The complexity of botnet systems like Mirai and the vulnerability of IoT devices together paint a pretty grim picture.

“[I]nsecure IoT devices are going to stick around like a bad rash – unless and until there is a major, global effort to recall and remove vulnerable systems from the internet,” explains Krebs. “In my humble opinion, this global clean-up effort should be funded mainly by the companies that are dumping these cheap, poorly-secured hardware devices onto the market in an apparent bid to own the market. Well, they should be made to own the cleanup efforts as well.”

Just in case you missed it, you can read Dyn’s statement here.

Internet schools man for trying to “mansplain” a NASA astronaut

Jessica Meir is a NASA astronaut who also works as an Assistant Professor of Anesthesia at Harvard Medical School and a postdoctoral researcher in comparative physiology at the University of British Columbia. She recently tweeted from outer space, presumably happy about the experience.

But a man replied that this isn’t “spontaneous” – ‘it’s simple thermo,’ he said.

Of course, as an astronaut and a Ph.D. in marine biology, it’s pretty safe to say that Meir knows her ‘simple thermo,’ but this is a case of mansplaining: explaining something to someone, typically a woman, in a condescending or patronizing manner.

Dr. Paul Coxon tweeted the exchange and it quickly went viral, before the man deleted his Twitter account. Not long after that, Coxon deleted his tweet too, presumably due to the coverage it was getting. The replies the conversation received were also hilarious:

Yeah, that went really well.

Just to clarify – there’s nothing wrong with trying to have a conversation with a researcher or an astronaut online, and there’s also nothing necessarily wrong with contradicting them. However, assuming that the person you’re talking to doesn’t understand a simple topic – especially when she’s clearly qualified – is wrong.

Net neutrality wins in Europe – a victory for the internet as we know it

New net neutrality guidelines from the Body of European Regulators for Electronic Communications (BEREC) confirm strong protection for net neutrality – ensuring the state of the open web as we know it.

A discussion of tremendous importance for the internet was taking place these days, although most of us weren’t even aware of it. Europe’s telecommunications regulator was debating how to close some loopholes in the rules governing internet service providers (ISPs).

“Europe is now a global standard-setter in the defence of the open, competitive and neutral internet. We congratulate BEREC on its diligent work, its expertise and its refusal to bend to the unreasonable pressure placed on it by the big telecoms lobby”, said Joe McNamee, Executive Director of European Digital Rights (EDRi).

That might not sound like much, but it actually matters a lot.

Why this matters

Net neutrality means that all internet traffic is treated equally, without blocking or slowing down particular websites or information sources. Let’s take an example: if this weren’t the case, your internet provider would be allowed to keep Netflix streaming at full speed while significantly slowing down Amazon or any other source (if they had an agreement in this sense). Naturally, they’d create a bias towards watching Netflix.

But it goes even deeper. Your provider could, for instance, block or slow down some media outlets, steering you towards the sources of information it prefers. Basically, it could quietly herd people to one website or another.

Ensuring that the open web stays the way it is is vital not only for healthy competition between websites, but also to avoid potential mass manipulation by ISPs. If you want, it’s a way to keep the internet free and open for all – and this is exactly what was decided in Europe.

“ISPs are prohibited from blocking or slowing down of Internet traffic, except where necessary,” BEREC said. “The exceptions are limited to: traffic management to comply with a legal order, to ensure network integrity and security, and to manage congestion, provided that equivalent categories of traffic are treated equally.”

So why was this debated anyway?

As The Verge writes, the net neutrality rules adopted by the European Parliament last year were meant to strengthen net neutrality by requiring internet service providers to treat all web traffic equally, without favoring some services over others. However, these regulations contained several loopholes that could have been abused, especially by allowing ISPs to create “fast lanes” for “specialized services.” If you’re deciding between reading two online newspapers, for instance, the speed at which they load can be a deciding factor, and if your ISP tampers with load speeds, that too is a form of manipulation. Thankfully, this won’t be the case.

ZME Science strongly supports net neutrality

Both as individuals and as a publisher, we support net neutrality. This is a global issue of paramount importance, with the potential to profoundly affect our lives. Thankfully, net neutrality has enjoyed victories in the USA, India, and Latin America, and now Europe has joined in as well.

But society must remain vigilant. It’s up to civil society, to you and me, to make sure that these rules are enforced and that other forms of internet manipulation don’t creep up on us.

Relying too much on the Internet for fact finding could hurt your brain

With the huge data repositories of the internet just one finger tap away, our brains may be starting to slack in the memory department. A new study found that our increasing reliance on the Internet may be affecting our ability to solve problems or learn and recall facts.

The Internet is a monumental achievement of humanity. The speed and distance over which it allows us to share data are nothing short of staggering. Just 30 years ago, the only way to share your Snapchat pics with anyone would have been through the post office, or a fax — now you can send them to anyone in the world in less time than it takes to say Snapchat.

Image credits Tony Webster / Flickr

Needless to say, today the Internet has become the way to transmit data. But our brains may actually be affected, not uplifted, by this huge resource at our disposal.

Researchers at the University of California, Santa Cruz and the University of Illinois, Urbana-Champaign have found that users show a compounding “cognitive offloading” effect when they use the Internet over their own memories. In other words, our brains have a tendency to rely on online information over their own memories or conclusions, and this only becomes worse the more we do it.

The team wanted to determine how likely we are to reach for a computer or smartphone when required to answer questions. For their experiment, they divided the participants into two groups and had them answer some trivia questions. At first, one group was told to use only their memory and the other only Google. But for the subsequent questions, which were easier to answer, both groups were given the option to answer either from memory or by using information available online.

Participants who had previously relied on Google for answers were more likely to use the Internet to answer than those who relied on their memory. They also spent less time trying to answer a question on their own before turning to the Internet. Actually, 30% of participants who had previously used the Internet to answer didn’t even try to answer a single question from memory.

“Memory is changing. Our research shows that as we use the Internet to support and extend our memory we become more reliant on it,” said lead author Dr Benjamin Storm.  “Whereas before we might have tried to recall something on our own, now we don’t bother. As more information becomes available via smartphones and other devices, we become progressively more reliant on it in our daily lives.”

The results indicate that employing a medium for direct fact-finding makes it much more likely that you’ll keep using it in the future, even if you may already hold the answers.

So does this mean that the Internet is turning us into mindless zombies? Probably not, but only time will tell. What is certain, however, is that as the Internet becomes more comprehensive, dependable, and faster than human memory, remembering trivial facts, numbers, or figures is inevitably becoming less necessary for everyday life. And our brains are more than willing to let it take over so they can slack off.

The full paper, titled “Using the Internet to access information inflates future use of the Internet to access other information,” has been published online in the journal Memory.

Access to the Internet is a basic human right, the UN decides

The UN has ruled that from now on, Internet access will be counted as one of every human’s basic rights. This is stipulated in a freshly passed resolution for the “promotion, protection, and enjoyment of human rights on the internet.” The document also condemns any government that willingly disrupts their citizens’ free access to the Internet.

Seattle residents protesting the TPP in February of 2015.
Image via flickr user Backbone Campaign

It’s a great time for the Internet! The resolution passed last Friday by the UN stresses that “the same rights people have offline must also be protected online,” lending even more weight to the freedom of expression protected by Article 19 of both the Universal Declaration of Human Rights (UDHR) and the International Covenant on Civil and Political Rights (ICCPR).

“The resolution is a much-needed response to increased pressure on freedom of expression online in all parts of the world”, said Thomas Hughes, executive director of Article 19, a British organization working to promote freedom of expression and information.

“From impunity for the killings of bloggers to laws criminalising legitimate dissent on social media, basic human rights principles are being disregarded to impose greater controls over the information we see and share online,” he added.

The document is an official recognition of one simple fact: the Internet is an incredibly powerful tool that “facilitates vast opportunities for affordable and inclusive education globally,” and we should all be free to access and use it — regardless of what our governments are after or what society thinks our role should be. Increasing Internet access and the spread of technology are also among the goals of the 2030 Agenda for Sustainable Development, as they play a central role in “accelerating human progress.” The resolution highlights a few goals that countries should strive for to ensure freedom of expression on the web:

  • Addressing security concerns in “a way that ensures freedom and security on the Internet.”
  • Creating a framework to hold accountable, by law, those responsible for human rights violations and abuses against people exercising their rights online.
  • Recognizing the importance of online privacy.
  • Recognizing the importance of education for women and girls in relevant technology fields.

The UN’s decision is a huge step forward for those who strive for an Internet where everyone is equal and free, but it’s not a final ruling on the subject. The resolution passed with a majority vote, supported by 70 countries. It was opposed by countries including Russia, China, and Saudi Arabia, which was to be expected given their political climate. What was surprising, however, was the opposition from democratic countries like South Africa, India, or Indonesia. The sticking point was a passage that “condemns unequivocally measures to intentionally prevent or disrupt access to or dissemination of information online.”

“We are disappointed that democracies like South Africa, Indonesia, and India voted in favour of these hostile amendments to weaken protections for freedom of expression online…A human rights based approach to providing and expanding Internet access, based on states’ existing international human rights obligations, is essential to achieving the Agenda 2030 for Sustainable Development, and no state should be seeking to slow this down,” Hughes added.

“Governments must now act on the international commitments in this resolution to protect freedom of expression and other human rights online, at all times.”

The biggest shortcoming of the resolution is that it’s non-binding — it can’t be legally enforced. Still, such a high-level ruling is bound to raise awareness worldwide, giving the public some footing to deal with their governments on this issue. Until the UN reaches a legally binding decision, however, it’s only going to be a small foothold.

 

New NASA transfer protocol makes space Wi-Fi better than yours

NASA has been working on a space-friendly internet technology for years, and earlier this month those efforts were rewarded. The agency has installed the first functioning Delay/Disruption Tolerant Networking (DTN) system aboard the ISS. It is expected to improve data availability and automate data transfers for space station experimenters, resulting in more efficient bandwidth utilization and more data return.

The DTN protocol allows data to be transferred reliably over unstable channels by storing it at intermediate nodes until it can be forwarded.
Image via NASA.

Keeping an open line between our planet and outer space is a very difficult task at best. The huge distances involved are the foremost problem, but there’s also radiation to consider, plus planets, asteroids, and spacecraft whizzing about, blocking the signal.

Up to now, NASA handled data transfer through three networks of distributed ground stations and relay satellites, supporting both their own and non-NASA missions: the Deep Space Network (DSN), the Near Earth Network (NEN), and the Space Network (SN). All of them transfer information using point-to-point (or direct) relaying between two nodes — similarly to how a telephone landline works.

The problem is that successful space exploration requires the ability to exchange data, a lot of data, fast and reliably, between many different nodes. It’s not something you can handle over the phone, even with the most stable of lines. So NASA has been looking to adapt the terrestrial Internet, on a much wider scale, for space use.

The result is called Delay/Disruption Tolerant Networking, and it has been in the making for a few years now. The main difference between the DTN protocol and those governing a wireless network down here lies in how they handle data transfer. For you and me, when something blocks our Wi-Fi, the connection slows or drops entirely. The DTN protocol, however, stores data if a connection is interrupted, then forwards it through relay stations to its intended destination. This means the network can function even when a recipient server is offline.
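
To make the store-and-forward idea concrete, here is a minimal sketch in Python. The Node class, its fields, and the scenario below are our own illustration of the general principle, not NASA's actual implementation (which builds on the standardized Bundle Protocol):

```python
from collections import deque

class Node:
    """Toy store-and-forward node. If the next hop is unreachable,
    bundles are kept in a local buffer and re-sent once the link
    returns, instead of being dropped like an ordinary packet."""

    def __init__(self, name):
        self.name = name
        self.online = True
        self.inbox = []          # bundles successfully delivered here
        self.pending = deque()   # bundles waiting for a working link

    def send(self, bundle, next_hop):
        if next_hop.online:
            next_hop.inbox.append(bundle)
        else:
            self.pending.append((bundle, next_hop))  # store, don't fail

    def retry_pending(self):
        """Re-attempt delivery whenever connectivity may have returned."""
        for _ in range(len(self.pending)):
            bundle, next_hop = self.pending.popleft()
            self.send(bundle, next_hop)

# A relay holds the bundle while the ground station is unreachable.
relay, ground = Node("relay"), Node("ground")
ground.online = False
relay.send("experiment data", ground)  # link down: stored at the relay
ground.online = True
relay.retry_pending()                  # link back: bundle is delivered
print(ground.inbox)                    # ['experiment data']
```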

To create the DTN, NASA enlisted the help of one of the pioneers of the Internet, Dr. Vinton G. Cerf, Google vice president and a distinguished visiting scientist at NASA’s Jet Propulsion Laboratory in Pasadena, California. He predicts the technology will bring many benefits in space as well as on Earth, especially in disaster relief conditions.

“Our experience with DTN on the space station leads to additional terrestrial applications especially for mobile communications in which connections may be erratic and discontinuous,” said Cerf. “In some cases, battery power will be an issue and devices may have to postpone communication until battery charge is adequate. These notions are relevant to the emerging ‘Internet of Things’. ”

NASA installed the first DTN system earlier this month in the ISS's Telescience Resource Kit (TReK) — a software suite for researchers to transmit and receive data between operations centers and their payloads aboard the station. NASA reports that adding this service on the station will also enhance mission support applications, including operational file transfers.

 

Engineers Just Smashed Record for Fast Wireless Data Transmission: 6 Gigabits per second

Wi-Fi just got way faster. A team of researchers from the Fraunhofer Institute for Applied Solid State Physics just beat the previous wireless data transmission record by a factor of 10.

The team prepares its transmitter. (Image: Photo Jörg Eisenbeis, KIT)

In order to achieve this feat, they used signals in the 71–76 GHz radio frequency band, which is mostly used for terrestrial and satellite broadcasting. To make things even better, they achieved an impressive signal-to-noise ratio, avoiding bandwidth waste. According to Gizmodo, they devised a system of ultra-efficient transmitters and receivers. The transmitters are based on semiconductor chips made of gallium nitride, which provide a high-power signal transmitted from a focused parabolic antenna. The team declared:

“Transmitting the contents of a conventional DVD in under ten seconds by radio transmission is incredibly fast – and a new world record in wireless data transmission. With a data rate of 6 Gigabit per second over a distance of 37 kilometers, a collaborative project with the participation of researchers from the University of Stuttgart and the Fraunhofer Institute for Applied Solid State Physics IAF exceeded the state of the art by a factor of 10.”

They transmitted the data between a 45-story tower in central Cologne and the Space Observation Radar in Wachtberg, 23 miles away.
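
As a quick sanity check on the DVD claim, here is the arithmetic, assuming a standard 4.7 GB single-layer disc (the capacity is our assumption, not a detail the team specified):

```python
# "DVD in under ten seconds" at 6 gigabits per second.
# The 4.7 GB single-layer capacity is an assumed figure.
dvd_gigabytes = 4.7
link_gigabits_per_second = 6.0

transfer_time = dvd_gigabytes * 8 / link_gigabits_per_second  # bytes -> bits
print(f"{transfer_time:.1f} s")  # ~6.3 s, comfortably under ten seconds
```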

This improvement in technology and infrastructure has obvious applications. We can all appreciate some faster internet, but it could be even more useful in places where wired connections simply aren’t possible, or for emerging technologies around the Internet of Things and Industry 4.0.

Scientists shuttle data at 1.125 Tbps, or 50,000 times faster than your average UK broadband

British researchers at University College London set the record for the fastest data transfer rate: a mind-boggling 1.125 terabits per second (Tbps). That’s 50,000 times faster than the average UK broadband connection (24 Mbps), or just fast enough to download the entire Game of Thrones series in HD in just one freaking second.

Image: Pixabay CC0 Public Domain

The achievement was made possible by an optical communications system combining multiple transmitter channels and a single receiver. The researchers set up fifteen channels, each carrying an optical signal at a different wavelength. Grouping these channels together created a ‘super-channel’. By tuning the modulation format and code rate of each channel, the researchers were able to maximize the data transfer rate.
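
The arithmetic behind the headline figures is easy to verify. Splitting the aggregate rate evenly across the channels, as done below, is our simplifying assumption rather than a detail from the paper:

```python
# Rough numbers behind the record, assuming the fifteen channels
# share the 1.125 Tb/s aggregate equally.
total_gbps = 1125.0
channels = 15
print(f"{total_gbps / channels:.0f} Gb/s per channel")  # 75 Gb/s

# Speed-up over the average UK broadband connection (24 Mb/s):
uk_mbps = 24.0
print(f"{total_gbps * 1000 / uk_mbps:,.0f}x faster")    # 46,875x, i.e. ~50,000x
```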

Super-channels aren’t entirely new; they have been used for a while to shuttle vast amounts of data between cities, countries, and even continents. These can operate at up to 500 Gbps, employing multiple coherent carriers that are digitally combined to create an aggregate channel of a higher data rate on a single high-density line card that can be deployed in one operational cycle. The UCL researchers found the best way to encode the signals for maximum efficiency.

Lead researcher, Dr Robert Maher, UCL Electronic & Electrical Engineering, said: “While current state-of-the-art commercial optical transmission systems are capable of receiving single channel data rates of up to 100 gigabits per second (Gb/s), we are working with sophisticated equipment in our lab to design the next generation core networking and communications systems that can handle data signals at rates in excess of 1 terabit per second (Tb/s).”

It’s also worth noting that the researchers’ setup fed the signals directly into the single receiver. In real life, of course, this data would be shuttled across vast distances, where interference is bound to happen. Still, it’s very promising to hear that data can be transferred this fast.

Findings were reported in Scientific Reports.

 

Having access to the Internet changes the way you think

The Internet is a wonderful and wonderfully powerful place. Just think about it: if your parents needed an article to show their college friends that nah-i’m-totally-right-and-you’re-not (it’s a big part of college life), they had to go looking in a library — you have access to almost all of human knowledge with just a few keystrokes.

Or a few minutes’ walk.
Image via wikimedia

But it turns out that having such pervasive access to information may actually make us rely less on the knowledge we already have, altering how we think, according to a recent study by University of Waterloo Professor of Psychology Evan F. Risko, published in the journal Consciousness and Cognition.

For the study, 100 participants were asked a series of general-knowledge questions (such as naming the capital of France). For the first half of the test, participants didn’t have access to the Internet and would indicate whether they knew the answer or not. In the second half, they had Internet access and were required to look up the answers they reported they didn’t know.

In the end, the team found that when the subjects had access to the web, they were 5 percent more likely to report that they didn’t know an answer, and in some contexts, they reported feeling as though they knew less compared to those without access.

“With the ubiquity of the Internet, we are almost constantly connected to large amounts of information. And when that data is within reach, people seem less likely to rely on their own knowledge,” said Professor Risko, Canada Research Chair in Embodied and Embedded Cognition.

The team believes that giving people access to the Internet might make it seem less acceptable to them to claim they know something and then turn out to be wrong. Another theory they considered is that people were more likely to say they didn’t know the answer because looking it up online gave them an opportunity to confirm their knowledge or satiate their curiosity, both highly rewarding processes.

“Our results suggest that access to the Internet affects the decisions we make about what we know and don’t know,” said Risko. “We hope this research contributes to our growing understanding of how easy access to massive amounts of information can influence our thinking and behaviour.”

Professor Risko says he plans to further the research in this area by investigating the factors that lead to individuals’ reduced willingness to respond when they have access to the web.