Still using Zoom? Zoom won't encrypt free calls because it wants to comply with law enforcement

If you're a free Zoom user waiting for the company to roll out end-to-end encryption to better protect your calls, you're out of luck. Free calls won't be encrypted, and law enforcement will be able to access your information in case of 'misuse' of the platform.

Zoom CEO Eric Yuan said today that the video conferencing app's upcoming end-to-end encryption feature will be available only to paid users. After announcing the company's financial results for Q1 2020, Yuan said the firm wants to keep the feature away from free users so it can work with law enforcement in case the app is misused:

"Free users, for sure, we don't want to give that [end-to-end encryption]. Because we also want to work it together with FBI and local law enforcement, in case some people use Zoom for bad purpose."

In the past, platforms with end-to-end encryption, such as WhatsApp, have faced heavy scrutiny in many countries because they were unable to trace the origins of problematic and misleading messages. Zoom likely wants to avoid being in such a position and to comply with local laws so it can keep operating across the globe. Alex Stamos, a security consultant working with Zoom, said the company wants to catch repeat offenders for hate speech or child exploitative content by not offering end-to-end encryption to free users.

In March, The Intercept published a report stating that the company didn't use end-to-end encryption, despite claiming so on its website and in its security white paper. Zoom later apologized and issued a clarification specifying that it didn't provide the feature at the time. Last month, the company acquired Keybase.io, an encryption-based identity service, to build out its end-to-end encryption offering.

Yuan said today that the company has received a lot of feedback from users on encryption and is working on delivering it. However, he didn't specify a release date for the feature.
According to the Q1 2020 results, the company's revenue grew 169% year-on-year. Zoom has more than 300 million daily participants attending meetings on the platform.

Related Feeds

Galaxy Note 20 Plus leaks: renders suggest a slightly bigger screen and much bigger camera bump

Last year's Galaxy Note 10 Plus was truly a thing of beauty, with an amazing screen and superlative industrial design. And, judging by some high-quality (and completely unofficial) renders of the Galaxy Note 20 Plus, it looks like Samsung won't be changing too much in 2020. The only major difference is a new, bulkier camera module on the rear of the device.

These renders come from noted leaker @OnLeaks in collaboration with phone-case maker Pigtou. As with all renders based on leaked CAD drawings, they should be taken with a pinch of salt, but the design they suggest for the Note 20 Plus seems reasonable enough.

The most obvious feature is the big, nearly edge-to-edge display, with curved bezels, sloping sides, and a small, central hole-punch cutout for the selfie camera that's near identical to the 2019 design. The top and bottom edges also look the same as those of the Note 10 Plus, with a speaker grille, charging port, and space for the signature S Pen stylus.

In terms of size, the Note 20 Plus will reportedly be slightly bigger than the 10 Plus, with a 6.9-inch display instead of last year's 6.8-inch AMOLED screen, and slightly longer but thinner dimensions with the same overall width (165mm long, 77.2mm wide, and 7.6mm thick, according to leaks).

The most noticeable change, though, is the camera module on the rear of the device, which is much bigger than that of the 2019 Note 10 Plus, according to the renders. On the Note 10 and Note 10 Plus this module was a pretty slim oval containing three lenses, while the flash module was positioned to one side, flush with the case (on the 10 Plus there are two additional divots that supply the phone's depth-sensing capabilities).
But the Note 20 renders show a larger, more rectangular module that apparently integrates all these components (lenses and assorted gubbins) into a single raised unit. This looks extremely similar to the camera system on this year's Galaxy S20 Ultra, which was very much designed to show off Samsung's photography abilities. The S20 Ultra's module contained five lenses sporting up to 108 megapixels and a 4x optical zoom that offered solid results up to 10x with the help of software. There's a lot to say about that camera system, but check out our full review from February for more information.

If the Note 20 and 20 Plus are anything like previous iterations in the series, they'll contain the best, biggest, and brightest of Samsung's smartphone specs, so it's not unreasonable to expect some high-end camera hardware. We've not seen any leaked specs for the 2020 Notes, though, so we'll have to see what surfaces in the months to come. Reports suggest that Samsung will launch the Note 20 in August, though this will likely be an online-only event in response to the ongoing COVID-19 pandemic.

youtu.be/pS8bErtjIDQ

Complete look of Samsung Galaxy Note 20 Plus (based on leaked CAD drawings)

Yup, Facebook and Mark are evil. Facebook reportedly ignored its own research showing algorithms divided users

An internal Facebook report presented to executives in 2018 found that the company was well aware that its product, specifically its recommendation engine, stoked divisiveness and polarization, according to a new report from The Wall Street Journal. Yet, despite warnings about the effect this could have on society, Facebook leadership ignored the findings and has largely tried to absolve itself of responsibility for the partisan divides and other forms of polarization it directly contributed to, the report states. The reason? Changes might disproportionately affect conservatives and might hurt engagement, the report says.

"Our algorithms exploit the human brain's attraction to divisiveness," one slide from the presentation read. The group found that if this core element of its recommendation engine were left unchecked, it would continue to serve Facebook users "more and more divisive content in an effort to gain user attention & increase time on the platform." A separate internal report, crafted in 2016, said 64 percent of people who joined an extremist group on Facebook only did so because the company's algorithm recommended it to them, the WSJ reports.

FACEBOOK FOUND THAT ITS ALGORITHMS WERE PUSHING PEOPLE TO JOIN EXTREMIST ORGANIZATIONS

Leading the effort to downplay these concerns and shift Facebook's focus away from polarization has been Joel Kaplan, Facebook's vice president of global public policy and former deputy chief of staff under President George W. Bush. Kaplan is a controversial figure, in part due to his staunch right-wing politics (he supported Supreme Court Justice Brett Kavanaugh throughout his nomination) and his apparent ability to sway CEO Mark Zuckerberg on important policy matters.
Kaplan has taken on a larger role within Facebook since the 2016 election, and critics say his approach to policy and moderation is designed to appease conservatives and stave off accusations of bias. Kaplan, for instance, is believed to be partly responsible for Facebook's controversial political ad policy, under which the company said it would not fact-check misinformation in campaign ads. He has also influenced Facebook's more hands-off approach to speech and moderation over the last few years by arguing that the company doesn't want to seem biased against conservatives.

The Wall Street Journal says Kaplan was instrumental in weakening or entirely killing proposals to change the platform to promote social good and reduce the influence of so-called "super-sharers," who tended to be aggressively partisan and, in some cases, so hyper-engaged that they might be paid to use Facebook or might be bots. Kaplan pushed back against some of the proposed changes, many of which were crafted by News Feed integrity lead Carlos Gomez Uribe, for fear they would disproportionately affect right-wing pages, politicians, and other parts of the user base that drove up engagement.

One notable project Kaplan undermined was called Common Ground, which sought to promote politically neutral content on the platform that might bring people together around shared interests like hobbies. But the team building it said it might require Facebook to take a "moral stance" in some cases by choosing not to promote certain types of polarizing content, and that the effort could harm overall engagement over time, the WSJ reports. The team has since been disbanded.

In a statement, a Facebook spokesperson tells The Verge, "We've learned a lot since 2016 and are not the same company today. We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve.
Just this past February we announced $2M in funding to support independent research proposals on polarization.”

Just stop using FB's WhatsApp, if you care about privacy. Signal announces new face-blurring tool for Android and iOS

Encrypted messaging app Signal has announced a new face-blurring tool that will be incorporated into the latest Android and iOS versions of the software. Users sharing pictures through the app will be able to quickly blur faces, adding another layer of privacy to pictures, though not necessarily hiding the subject's identity completely.

In a blog post announcing the update, Signal co-founder Moxie Marlinspike linked the update to the worldwide protests against racism and police violence sparked by the killing of George Floyd by law enforcement. These protests have led to record downloads for Signal, which uses end-to-end encryption to make messages harder to intercept. "We've also been working to figure out additional ways we can support everyone in the street right now," writes Marlinspike. "One immediate thing seems clear: 2020 is a pretty good year to cover your face."

When you take a picture through Signal and select the Blur option in the toolbar, the app will automatically detect any faces it spots in your image. If it misses any, users can simply blur faces by hand, or blur any other features they want to hide. All processing is done on-device, meaning uncensored images never leave the user's phone.

Although blurring faces in photographs certainly makes pictures more private, it's by no means a foolproof way of anonymizing images and hiding someone's identity. Some blurring and pixelation methods can be reversed with the right tools, for example. And anyone seeking to identify someone in a picture can work from other information, such as clothing and tattoos, which can be compared with other, un-blurred images. Even if attendees at a protest, for example, hide the identity of fellow protestors, that doesn't mean other groups and individuals will do the same.
Surveillance cameras, police body cameras, and press photographers are all capturing images. Ultimately, the best way to obscure your identity is to take matters into your own hands and wear a mask.
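The basic idea behind a blur or pixelation pass like Signal's can be illustrated with a toy sketch (this is a hypothetical illustration, not Signal's actual code, and it skips the face-detection step entirely): each block of pixels inside the selected region is replaced by the block's average value, destroying fine detail such as facial features.

```python
# Hypothetical sketch of region pixelation, not Signal's implementation.
# The "image" is a plain 2D list of grayscale values (0-255). Real apps
# would first run an on-device face detector to find the region to censor.

def pixelate_region(image, top, left, height, width, block=4):
    """Return a copy of `image` with the given rectangle pixelated.

    Each block x block tile inside the rectangle is replaced by the
    integer average of its pixels, erasing fine detail.
    """
    out = [row[:] for row in image]
    for by in range(top, top + height, block):
        for bx in range(left, left + width, block):
            ys = range(by, min(by + block, top + height))
            xs = range(bx, min(bx + block, left + width))
            avg = sum(out[y][x] for y in ys for x in xs) // (len(ys) * len(xs))
            for y in ys:
                for x in xs:
                    out[y][x] = avg
    return out

# Example: an 8x8 white image with a dark 2x2 "feature" at (2, 2).
img = [[255] * 8 for _ in range(8)]
for y in (2, 3):
    for x in (2, 3):
        img[y][x] = 0

censored = pixelate_region(img, 0, 0, 8, 8, block=4)
# The top-left 4x4 tile held 12 white and 4 black pixels, so it
# averages to (12 * 255) // 16 = 191; the dark feature is gone.
print(censored[0][0])  # → 191
```

Note that this also illustrates the article's caveat: because averaging over small blocks preserves coarse brightness structure, crude pixelation with a small block size can sometimes be partially reversed or matched against candidate photos, which is why blurring alone is not a foolproof anonymization method.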
