By Brian Linder
President Donald Trump reportedly threatened Twitter and Facebook over an “illegal situation” Saturday morning and vowed that his administration would “remedy” it.
He did not make clear what the “illegal situation” was, but he did write that “The Radical Left is in total command and control of Facebook, Instagram, Twitter and Google.”
His comments were reportedly made on Twitter and were related to a video of Michelle Malkin, who The Independent reported “has been criticized for backing white nationalist activists.”
More than a quarter of the most-viewed coronavirus videos on YouTube contain “misleading or inaccurate information”, a study suggests.
In total, the misleading videos had been viewed more than 62 million times.
Among the false claims was the idea that pharmaceutical companies already have a coronavirus vaccine but are refusing to sell it.
YouTube said it was committed to reducing the spread of harmful misinformation.
The researchers suggested “good quality, accurate information” had been uploaded to YouTube by government bodies and health experts.
But they said the videos were often difficult to understand and lacked the popular appeal of YouTube stars and vloggers.
The study, published online by BMJ Global Health, looked at the most widely viewed coronavirus-related videos in English, as of 21 March.
After excluding duplicate videos, videos longer than an hour and videos without relevant audio or visual material, the researchers were left with 69 to analyse.
The videos were scored on whether they presented exclusively factual information about viral spread, coronavirus symptoms, prevention and potential treatments.
Videos from government agencies scored significantly better than other sources, but were less widely viewed.
Of the 69 videos analysed, 19 were found to include misinformation.
The report recommends that governments and health authorities should collaborate with entertainment news sources and social media influencers to make appealing, factual content that is more widely viewed.
YouTube said in a statement: “We’re committed to providing timely and helpful information at this critical time, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, using NHS and World Health Organization (WHO) data, to help combat misinformation.
“We have clear policies that prohibit videos promoting medically unsubstantiated methods to prevent the coronavirus in place of seeking medical treatment, and we quickly remove videos violating these policies when flagged to us. Now any content that disputes the existence or transmission of Covid-19, as described by the WHO and the NHS, is in violation of YouTube policies. For borderline content that could misinform users in harmful ways, we reduce recommendations.
“We’ll continue to evaluate the impact of these videos on communities around the world.”
by Marianna Spring, specialist disinformation and social media reporter
In recent weeks, an increasing number of highly polished videos promoting conspiracy theories have been shared on YouTube – and they have proved very popular.
So these findings – although concerning – are not surprising.
The accurate information shared by trusted public health bodies on YouTube tends to be more complex.
It can lack the popular appeal of the conspiracy videos, which give misleading explanations to worried people who are looking for quick answers, or someone to blame.
That includes videos such as Plandemic, which was widely shared online last week.
High-quality production values and interviews with supposed experts can make these videos very convincing. Often facts will be presented out of context and used to draw false conclusions.
And tackling this kind of content is a game of cat-and-mouse for social media sites.
Once videos gain traction, even if they are removed, they continue to be uploaded repeatedly by other users.
It is not just alternative outlets uploading misinformation either. Whether for views or clicks, the study suggests some mainstream media outlets are also guilty of spreading misleading information.
By SAM DEAN
Earlier this week, a Southern California filmmaker posted his newest production on Facebook and YouTube and let the social media platforms do what they’ve been built for: make his video go viral.
Within days, the 26-minute video had spread like wildfire, racking up millions of views and attracting legions of new fans. The video, called “Plandemic,” looks like a serious documentary, with well-shot interviews intercut with news footage and ominous music. But it propagates coronavirus conspiracy theories, which could encourage viewers to ignore public health recommendations or attempt ineffective or dangerous treatments for the viral infection.
By Thursday, the social media companies where the video proliferated pledged to stop the video’s spread. They’re now struggling to stop new copies from emerging. As of the time of this article’s publication, links to or versions of the video were still available on both Facebook and YouTube.
Medical misinformation has proliferated on the major social media platforms for years, especially around the topic of vaccine safety. The platforms have pledged to more strongly enforce misinformation policies, but the task has proven difficult for companies whose services are designed to allow users to reach large audiences with little oversight. But the coronavirus crisis has been especially fertile ground for conspiracy theorizing, inspiring viral videos spinning tales of international intrigue and profiteering cabals since nearly its inception.
The “Plandemic” video centers on interviews with a researcher named Judy Mikovits, whose false claims include the allegation that wealthy people are intentionally spreading the novel coronavirus to increase vaccination rates in the population at large and that wearing a mask can actually worsen viral symptoms.
In a statement, a Facebook representative said the company was removing the video from Facebook and Instagram and rejecting ads that include it, as part of its policy of taking down COVID-19-related misinformation that could lead to imminent harm. The company wrote in a blog post in mid-April that it had directed over 2 billion people to fact-checking information from the World Health Organization to try to combat misinformation about the pandemic.
“Suggesting that wearing a mask can make you sick could lead to imminent harm,” a Facebook representative said when asked about the “Plandemic” video response, “so we’ve removed the video.”
YouTube has posted notices on the uploads of the video that read: “This video has been removed for violating YouTube’s Community Guidelines.” The video platform Vimeo has also said that it is working to remove the video, and Twitter has been blocking hashtags and searches related to the video.
Mikki Willis, the filmmaker behind the video, is listed as founder and chief executive on the website of Elevate, an Ojai-based production company. Willis has a large following on Facebook. In recent weeks, he asked his followers to vote on a name for his newest video (other candidates included “The Oath” and “The Invisible Enemy”), and published long posts claiming to connect the WHO with conspiracy theories surrounding the Council on Foreign Relations and the recent death of Jeffrey Epstein.
The video’s virality was boosted by online anti-vaccine conspiracy theory activists, according to coverage in the MIT Technology Review. When YouTube began removing copies of the video on Thursday, supporters took to Twitter with their outrage, making the video’s title a trending topic, fueling further attention and media coverage.
A 91-year-old woman is YouTube famous for modeling her favorite quarantine outfits in backyard fashion shows.
Betty McDonald lives in Georgia, where businesses like gyms and bowling alleys began opening up before shelter-in-place orders ended on April 30. McDonald’s advanced age makes her vulnerable to the virus, but staying home has dampened her spirits.
Caretaker and neighbor Kim Taylor noticed. “Ms. Betty is used to social interaction — going to church, the senior citizen’s center, and Cracker Barrel with friends,” she tells Yahoo Life. “Not being able to dress up, she has been bored, so I suggested doing a fashion show outside.”
“That was music to my ears,” McDonald tells Yahoo Life. After a few twirls in her Albany yard wearing her fanciest garb, Taylor decided to introduce her to YouTube.
In two videos posted to Taylor’s YouTube channel, McDonald shows off a variety of Goodwill outfits from her three closets: a black-and-white jacket with a patterned skirt, a blue hat with jeweled hearts, and a red blouse with a white blazer, plus chunky, glittery jewelry.
The fashion shows are a tribute to her husband, John Henry McDonald, who died in 2007 after serving 20 years in the U.S. Air Force. The couple, married for 58 years, met at a USO dance at Tyndall Air Force Base in Panama City, Florida. As McDonald explains in an April 22 video, “We danced, we dated, and four months later, we married.”
McDonald loved dressing up for outings to the bowling alley, movie theater and dance club. “He was my best friend and he was my personal chef,” she adds in the video. “This is for you, dear John. God bless.”
To preserve McDonald’s energy, she and Taylor film each outfit on a separate day; Taylor then edits the footage and posts it on YouTube. Reading the positive comments has invigorated McDonald, who looks forward to dictating each reply as Taylor types on her laptop.
The fashion shows have distracted McDonald from worrying about the coronavirus pandemic, says Taylor. “She is a wonderful lady and when she does her fashion shows, the cares of the world fade away.”
Taylor is teaching McDonald how to send text messages on her flip phone, though she doesn’t have an internet connection. “She asks me, ‘What’s on YouTube today?’” says Taylor.
“Now, Ms. Betty calls me her producer,” she says. “I joke that when she’s famous, we’ll need [professional] camera equipment.” Adds McDonald, “I feel like a natural star.”
One of McDonald’s childhood dreams is to ride in a convertible car, so a test drive might be in their future, although Taylor will take the wheel. “I will sit in the back and wave,” says McDonald.
If you’ve been on Instagram today, you might see people taking advantage of a new glitch that lets them post really long Instagram photos. Here’s how the glitch worked and why you’re seeing photos that scroll endlessly on your Instagram feed. Unfortunately, it appears that the glitch was fixed and you’re not going to be able to create your own long photo at this time.
The glitch allowed people to post really long photos to Instagram — photos that appear to scroll forever. Unfortunately, the glitch appears to have now been fixed. Heavy tried to recreate the glitch and got an error message that read: “Something went wrong. Please try again later.” Others got a server error message or a message telling them they needed a better connection.
The glitch was spotted only on iOS; Android appeared to be unaffected, The Verge reported. To exploit the glitch and post your own long photo, you simply had to create a very long image, open Instagram, and pick the photo through the app. The app temporarily stopped cropping photos, allowing people to post long pictures. However, if a photo was too long, it wouldn’t work or would show up black. Others appeared pixelated.
Some people were able to trigger the glitch by putting an image into a resizing program like simpleimageresizer.com, making it 30×600 pixels or smaller, and then posting it, one Redditor shared. They could also trigger it by resizing a photo to 30×600 pixels in Photoshop. Others said it was easier to simply search for existing super-long photos, such as the widely shared “Do You Love the Colour of the Sky” image, and upload one of those.
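The resize step Redditors described can also be done with a few lines of scripting. The sketch below is a hypothetical illustration using the Pillow library (assumed to be installed; file paths are placeholders), and since the glitch has been patched, the resulting image will no longer produce a long Instagram post:

```python
# Hypothetical sketch: shrink any image to the 30x600-pixel (tall, narrow)
# shape Redditors said triggered the glitch. Pillow is an assumed
# dependency, and the file paths are placeholders.
from PIL import Image


def make_tall_image(src_path: str, dst_path: str, size=(30, 600)) -> None:
    """Resize an image to the 30x600-pixel frame described on Reddit."""
    with Image.open(src_path) as img:
        # resize() returns a new image scaled to the given (width, height)
        img.resize(size).save(dst_path)


# Example: make_tall_image("photo.jpg", "tall_photo.png")
```

This does the same thing as the web-based resizers mentioned above: it ignores the original aspect ratio and forces the image into a 30×600 frame.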
Unfortunately, none of the methods used to access the glitch appear to work anymore. The glitch has seemingly been fixed and people are getting server errors when they try to post long images and join in on the fun. Others are just seeing a black photo when they try to upload a long photo and access the glitch.
Some people are pretty annoyed that Instagram fixed the glitch so quickly. One person wrote on Reddit: “So you’re telling me Insta is willing to patch this harmless glitch that’s funny and fun to mess with, but WON’T patch mine AND my friend’s DM’s breaking just because we leave a groupchat, and they won’t delete a post with obvious porn or gore, and they won’t give every account new features at the same time, but THIS gets patched almost immediately? Why are the serious issues not being fixed but the small things that are fun to mess with are?”