Tag Archives: YouTube

This guy on Youtube makes knives from foods, sand, and other crazy materials

Every once in a while, you come across something on Youtube that makes you go ‘What?!’. This was exactly one of those cases.

I found kiwami’s channel randomly, through this crazy video on making a knife from microwaved sand. Yes, really. Yes, it works — and yes, it’s crazy sharp. Here’s the video; more follow below.

As if that wasn’t crazy enough, kiwami (whose channel has garnered almost 500 million views) has a wealth of videos on making sharps from… things (there’s really no better way to put it).

Among others, he has made knives from candy, chocolate, fungi, seawater, tofu, teeth, bismuth, potatoes, and the list goes on. There’s really no way to describe how the Japanese YouTuber does it, but one thing’s for sure: this isn’t clickbait; even his milk knife is crazy sharp.

There’s something captivating about this craft, dancing between physics, chemistry, art, and entertainment (the videos are also funny and engaging).

The Japanese aesthetic is also very strong — there’s no music, no voiceover, just simple, descriptive videos of a person making sharp knives from things that have no business being knives. It’s strangely relaxing.

Here are a few more of his videos; there are plenty of others on kiwami’s channel.

How we get trapped in the same Youtube loops

If you use YouTube a lot, you’ve probably come across this: the same suggestions pop up again and again, and it often seems like we’re trapped in a bubble.

The two main types of Youtube videos: popular, ‘bubbly’ videos (left), and diverse, exploratory videos (right).

Camille Roth, a researcher at the French National Centre for Scientific Research, studied something rather unusual in the world of science: Youtube. Or rather, Youtube’s recommendation algorithm.

Along with his colleagues, he explored recommendations from a thousand videos on different subjects — a total of half a million recommendations. They then compared how different videos sent the users on different paths, based on these video recommendations.

Illustration of the recursive recommendation structure of the Youtube videos. Credits: Roth et al.

Since almost 2 billion people use Youtube every month, the content recommendations that the site makes can have an effect on a large chunk of mankind. The popular belief is that Youtube — and social media, in general — tends to form confinement bubbles (or echo chambers).

In this case, the popular belief seems to be true.

While some social networks thrive on helping users find new types of content, that is not often the case on Youtube.

When researchers looked at the recommendation chains formed, they found that the Youtube algorithm often tends to confine people in the same bubbles, promoting the same type of content over and over again.

“We show that the landscape defined by non-personalized YouTube recommendations is generally likely to confine users in homogeneous clusters of videos. Besides, content for which confinement appears to be most significant also happens to garner the highest audience and thus plausibly viewing time.”

Example of video networks. Image credits: Roth et al.

It tends to work like this: when you watch a video, you essentially enter a network of interconnected videos that can serve as recommendations. Depending on which video you start with, the recommendation network is more or less closed, steering you toward more similar or more diverse content.

In addition, the content that leads to the most confined recommendation networks also seems to revolve around the most-viewed videos or the ones with the longest viewing time — in other words, the more popular a video is, the more likely it is to send you into a closed loop, creating a self-reinforcing mechanism.

“To simplify our findings very roughly and informally, let us say that there are two main stereotypes of YouTube videos: some with a high number of views, featuring loopy and not very diverse suggestions, and some with a low number of views, which generally feature more diverse and exploratory suggestions,” the researchers explain.
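The two stereotypes the researchers describe can be illustrated with a toy simulation: follow random recommendations from a starting video and count how many distinct videos the walk ever reaches. This is a minimal sketch, not the study’s methodology; the graph, the video names, and the numbers are entirely hypothetical.

```python
import random

# Hypothetical toy recommendation graph: each video maps to the list of
# videos it recommends. Structure and names are invented for illustration.
recs = {
    "viral_hit":    ["viral_hit_2", "viral_hit_3"],
    "viral_hit_2":  ["viral_hit", "viral_hit_3"],
    "viral_hit_3":  ["viral_hit", "viral_hit_2"],
    "niche_doc":    ["history_talk", "lecture"],
    "history_talk": ["lecture", "niche_doc", "viral_hit"],
    "lecture":      ["essay", "history_talk"],
    "essay":        ["niche_doc", "lecture", "viral_hit_2"],
}

def confinement(start, steps=1000, seed=0):
    """Follow random recommendations and count distinct videos reached.

    A low count relative to `steps` means the walk stays trapped in a
    small cluster (a 'bubble'); a higher count means exploration."""
    rng = random.Random(seed)
    current, seen = start, {start}
    for _ in range(steps):
        current = rng.choice(recs[current])
        seen.add(current)
    return len(seen)

# The popular cluster only links to itself, so a walk starting there can
# never reach more than its 3 videos; the niche cluster links outward.
print(confinement("viral_hit"))
print(confinement("niche_doc"))
```

Note how the asymmetry runs one way: the niche cluster links into the popular one, but not vice versa, so long walks eventually get absorbed into the loop — a crude picture of the self-reinforcing mechanism described above.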

It’s important to keep this in mind as you’re using YouTube, particularly if you’re watching polarizing or biased videos: the more you watch of something, the likelier the algorithm is to suggest more similar things and reinforce the bias.

The study has been published in PLOS ONE.


YouTube conspiracy theorists dominate climate science content by hijacking search terms

YouTube is rife with false info regarding climate change, a new study finds.


Image via Pixabay.

If you’re planning to go online and watch a few informational videos about climate change over dinner, I have some bad news: a new study reports that some scientific terms (such as ‘geoengineering’) are being dominated by conspiracy theorists. These individuals have ‘hijacked’ the terms so that searches take users to a list of almost entirely non-scientific video content.

The authors recommend that influential YouTubers, politicians, and popular-culture figures work together to ensure that scientifically-accurate content reaches as many people as possible.

WrongTube

“Searching YouTube for climate-science and climate-engineering-related terms finds fewer than half of the videos represent mainstream scientific views,” says study author Dr. Joachim Allgaier, Senior Researcher at the RWTH Aachen University.

“It’s alarming to find that the majority of videos propagate conspiracy theories about climate science and technology.”

YouTube is a humongous platform. Almost 2 billion logged-in users visit it every month, which is roughly half the online world. Many people, including yours truly, see YouTube as a great resource for learning, and many channels produce accessible content about science, health, and technology. However, whether this content is reliable or not is a whole different discussion.

Allgaier wanted to know the quality of the information users find when searching for climate change and climate modification — it turns out much of it is complete baloney.

“So far, research has focused on the most-watched videos, checking their scientific accuracy, but this doesn’t tell us what an average internet user will find, as the results are influenced by previous search and watch histories,” reports Allgaier. “To combat this, I used the anonymization tool TOR to avoid personalization of the results.”

Allgaier searched for ten climate change-related terms and analyzed 200 of the videos YouTube showed him (all treating climate change or climate modification topics). Most of these videos go directly against the worldwide scientific consensus, as detailed by the UN Intergovernmental Panel on Climate Change, he reports.
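The headline figure here (“fewer than half represent mainstream views”) is just a share computed over the classified sample. A minimal sketch of that kind of tally follows; the category labels and counts are made up for illustration and are not the study’s actual coding scheme or results.

```python
from collections import Counter

# Hypothetical classification of a 200-video sample. These counts are
# invented so that non-consensus content forms the majority, mirroring
# the qualitative finding, not the paper's real numbers.
labels = (
    ["opposes scientific consensus"] * 110
    + ["supports scientific consensus"] * 80
    + ["unrelated / unclear"] * 10
)

counts = Counter(labels)
total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label}: {n}/{total} ({n / total:.0%})")

# "Fewer than half represent mainstream views" translates to:
mainstream_share = counts["supports scientific consensus"] / total
print(f"mainstream share: {mainstream_share:.0%}")
```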


Consensus such as “chemtrails are a conspiracy theory”.
Image via Pixabay.

Many of these videos propagated the chemtrail conspiracy theory, Allgaier explains. In broad strokes, chemtrailers believe that the condensation trails airplanes generate are purposefully laced with harmful substances to modify the weather, control human populations, or carry out biological and chemical warfare. It shouldn’t need saying, but there is no evidence to support this theory.

Worryingly, however, Allgaier found that these theorists have taken over some scientific terms by mixing them into their content. Chemtrailers, he explains, explicitly advise their followers to use scientific terms in their videos to make them seem more reliable.

“Within the scientific community, ‘geoengineering’ describes technology with the potential to deal with the serious consequences of climate change, if we don’t manage to reduce greenhouse gases successfully. For example, greenhouse gas removal, solar radiation management or massive forestation to absorb carbon dioxide,” explains Allgaier.

“However, people searching for ‘geoengineering’ or ‘climate modification’ on YouTube won’t find any information on these topics in the way they are discussed by scientists and engineers. Instead, searching for these terms results in videos that leave users exposed to entirely non-scientific video content.”

Some of the conspiracy videos Allgaier found were monetized via adverts or through the sale of merchandise with conspiracy-theory motives. This made him question whether YouTube’s search algorithms help direct traffic towards this ‘dubious’ content. The way these algorithms work “is not very transparent,” he says, arguing that “YouTube should take responsibility to ensure its users will find high-quality information if they search for scientific and biomedical terms, instead of being exposed to doubtful conspiracy videos.”

Allgaier suggests that scientists and science communicators should seriously consider YouTube as a platform for sharing scientific information.

“YouTube has an enormous reach as an information channel, and some of the popular science YouTubers are doing an excellent job at communicating complex subjects and reaching new audiences,” he explains.

“Scientists could form alliances with science-communicators, politicians and those in popular culture in order to reach out to the widest-possible audience. They should speak out publicly about their research and be transparent in order to keep established trustful relationships with citizens and society.”

The paper “Science and Environmental Communication via Online Video: Strategically Distorted Communications on Climate Change and Climate Engineering on YouTube” has been published in the journal Frontiers in Communication.

Youtube will show fact checking for sensitive topics like vaccination

Youtube will start showing “information boxes” — brief fact-checking bits that debunk some of the most common and dangerous scientific misconceptions. In some areas, Youtube is already showing this type of box for conspiracy theories; now, it will be expanding it to other types of sensitive topics like vaccination.

Example of Youtube’s new feature in action. Note the snippet below the video.

The feature is currently available in India, one of YouTube’s largest markets, with over 250 million users. Millions of people in India are gaining access to the internet for the first time — this, coupled with cheap data plans, has brought forth an explosion in video streaming. People are more prone to rely on video streaming than on reading or other social media, using YouTube as a search engine and binging everything they come across.

This video boom has brought with it an explosion of disinformation. In the US, YouTube now shows Wikipedia excerpts next to conspiracy theory videos — a necessary step, but still a small one in the grand scheme of things. Now, the Google-owned video giant wants to take another small step and include fact-checking boxes next to misleading videos on sensitive topics.

“As part of our ongoing efforts to build a better news experience on YouTube, we are expanding our information panels to bring fact checks from eligible publishers to YouTube,” a spokesperson told BuzzFeed News.

Example of Youtube’s fact-checking box. Image credits: Youtube via Buzzfeed.

According to BuzzFeed, several human fact-checking services have already been contacted by YouTube. Some of these services are already working with Facebook to curb disinformation. The tech giants are resorting to human oversight because, at least so far, automated fact-checking has proven incapable of doing a reliable job.

It’s not entirely clear which types of videos the boxes will appear next to. The spokesperson cited the recent conflict between India and Pakistan as a potential source of misinformation; as an example, he showed an explosion in Syria that someone was trying to pass off as a Pakistani attack against Indian soldiers.

However, the company has not announced a timeline for releasing a similar feature in the rest of the world. Considering that the service is provided manually, scalability seems like a major issue. Misleading videos (and misleading information in general) have emerged as a major problem worldwide in the era of post-truth, one which social media giants are partially responsible for — and one for which there is no clear solution in sight.

Youtube has also recently announced the demonetization of all anti-vax channels, citing broader policy against the monetization of videos with “dangerous and harmful” content.

Youtube now adds facts below conspiracy theory videos

Well, well, well — if it isn’t the government trying to silence the truth! That was, at least the way yours truly intended it, sarcasm; but it’s also the way conspiracy theorists will likely interpret this move.

Example of Youtube’s new feature in action. Note the snippet below the video.

The internet has a truthfulness problem, and misinformation is spreading faster than ever. In this brave new post-truth world, separating fact from fiction is becoming increasingly difficult, and the conspiracy theorists (whether manipulative or just ill-informed) are becoming better and better at hiding the truth and replacing it with their own version.

Conspiracy theories have presumably been around since the dawn of man, and they’re still constantly lurking in the shadows of our day-to-day lives. Who hasn’t heard of alien abductions, mysterious brainwashing experiments, or, more recently, vaccines causing autism? We see them in the movies, we read about them in the press, and of course, we hear about them on social media.

Social media has proven the perfect platform to spread conspiracy theories, and Youtube is probably the king. It’s not that Youtube is necessarily doing something wrong, but it’s all about the very nature of the platform. Videos can be much more convincing than a written article or a photo, and it’s much easier to convey the narrative you want to in a video. Also, Youtube recommends related videos, so it’s easy to go on a conspiracy-theory binge from video to video. The website is even kind enough to recommend other videos based on your past preference — what’s not to like? It’s the best place to start going down the conspiracy rabbit hole. But Youtube, it seems, has had enough of it.

Youtube has recently started adding fact-confirming text below videos about climate change. A Wikipedia snippet simply reads “multiple lines of scientific evidence show that the climate system is warming.” It’s quite funny to hear videos going on and on about how global warming is a fraud, only to see the Wikipedia snippet bluntly refuting everything below the video. YouTube is also using Encyclopedia Britannica as a source for facts.

The company has been quite secretive about which sources it uses and which conspiracy theories it tries to address, but Wikipedia itself has written about this — and be warned, some of the theories addressed are quite weird.

YouTube has also invested $25 million in grants to news organizations that wish to expand video operations and combat fake news and conspiracy theories, in line with its parent company’s approach (Google has also started a digital news initiative to promote quality journalism and tackle fake news).

The feature is not available in all countries, and YouTube hasn’t announced where it is available. The company did say, however, that an algorithm decides which videos get the snippets — videos are not manually tagged. Uploaders have not been notified of the change, or of whether their videos are targeted. Of course, the new feature has left uploaders of conspiracy videos fuming. YouTuber Tony Heller, who uploads climate change denial videos, described the policy on Twitter as YouTube “putting propaganda at the bottom of all climate videos.” The move was praised by scientists, however, with climate scientist Michael Mann likening it to the warning label on a pack of cigarettes: “Warning — this video may or may not be promoting actual facts about climate change.”

Will this be a wrench in the wheels of conspiracy theories, or will it simply be ignored, with business continuing as usual? Only time will tell.

Google teams up with Stephen Hawking and launches teenage space experiment contest

Two lucky winners will have their space experiments performed in space, on the International Space Station by the astronauts stationed there.

The most popular video sharing website in the world, YouTube, has teamed up with NASA and several other key figures from the scientific community to launch YouTube Space Lab, a global effort challenging students between the ages of 14 and 18 to design an experiment that can be conducted in space. I know there are a lot of teenage readers here, so this competition might be your best chance to devise an experiment that is truly out of this world. Read on!

Curiously enough, Google chairman Eric Schmidt recently criticised science and technology education in the UK when he delivered the annual MacTaggart lecture in Edinburgh, saying the country needed to reignite children’s passion for subjects such as engineering and maths. Then came this press announcement from YouTube, which is owned by Google, publicizing this ambitious competition.

Only two winning entries will be selected, both of which will be performed by International Space Station astronauts. The goal is to engage students in science, engineering, and math, and to help them develop their creative and analytical faculties, officials said.

“The space station really is the greatest science classroom we have,” said former astronaut Leland Melvin, associate administrator for education at NASA headquarters in Washington, D.C., in a statement. “This contest will capitalize on students’ excitement for space exploration while engaging them in real-life scientific research and experimentation.”

To enter the competition, students must submit a two-minute video application explaining their experiment (on YouTube, of course) by Dec. 7. They can work alone or in groups of up to three people, and can submit up to three experiments in one of two disciplines: biology or physics.

The winners, besides the honour of having their experiment run in space, will get to experience weightlessness on a zero-g airplane flight, and can choose either to undergo astronaut training in Russia or to watch, in Japan, the launch of the rocket that takes their idea into space.

The top 60 experiments will be announced on Jan. 3, 2012, at which time final judging will begin. The judging panel is quite stellar, as one might expect: renowned astrophysicist Stephen Hawking; NASA’s human exploration and operations chief, Bill Gerstenmaier; astronauts Frank De Winne and Akihiko Hoshide; and noted “space tourist” Guy Laliberté.

For more information on the contest and how to enter, visit: http://www.youtube.com/SpaceLab.

YouTube releases live broadcasting service

Although YouTube, the Google-owned social video sharing website, has been dabbling with live broadcasting for years now, mostly with concerts (U2 events, notably) and political rallies (CitizenTube should be famous enough by now to serve as an example), it’s only recently that the company officially rolled out YouTube Live, a new feature that allows select content partners to stream live video around the clock.

In an official blog post, YouTube stated that “The goal is to provide thousands of partners with the capability to live stream from their channels in the months ahead. In order to ensure a great live stream viewing experience, we’ll roll this offering out incrementally over time.”

Live streams also allow users to comment on the video in real time with one another. Live channel owners can schedule specific events, for which users can choose to receive notifications on their YouTube home page.

For more info and insights visit youtube.com/live.