South Korean Government Handed Over Millions Of Travelers' Facial Images To Private Companies Without Their Consent

Facial recognition systems are becoming an expected feature in airports. Often installed under the assumption that collecting the biometric data of millions of non-terrorist travelers will prevent more terrorism, the systems are just becoming another bullet point on the list of travel inconveniences. Rolled out by government agencies with minimal or no public input and deployed well ahead of privacy impact assessments, airports around the world are letting people know they can fly anywhere as long as they give up a bit of their freedom.

What's not expected is that the millions of images gathered by hundreds of cameras will just be handed over to private tech companies by the government that collected them. That's what happened in South Korea, where facial images (mostly of foreign nationals) were bundled up and given to private parties without ever informing travelers this had happened (or, indeed, would be happening).

> The South Korean government handed over roughly 170 million photographs showing the faces of South Korean and foreign nationals to the private sector without their consent, ostensibly for the development of an artificial intelligence (AI) system to be used for screening people entering and leaving the country, it has been learned.

The agency carelessly handing out millions of facial images to private tech companies was the country's Ministry of Justice. Ironically enough, South Korean privacy activists (as well as some of the millions contained in the database) say this action is exactly the opposite of "justice."

While the use of facial recognition technology has become common for governments across the world, advocates in South Korea are calling the practice a "human rights disaster" that is relatively unprecedented. "It's unheard-of for state organizations - whose duty it is to manage and control facial recognition technology - to hand over biometric information collected for public purposes to a private-sector company for the development of technology," six civic groups said during a press conference last week.

The project - one with millions of unaware participants - began in 2019. The MOJ is in the process of obtaining better facial recognition tech to arm its hundreds of airport cameras with. To accomplish this, it apparently decided the private sector should take everything cameras had collected so far and use those images to train facial recognition AI.

The public was never informed of this by the Ministry of Justice. It took another government employee to deliver the shocking news. National Assembly member Park Joo-min requested information from the Ministry about its "Artificial Intelligence and Tracking System Construction" project and received this bombshell in return.

Maybe the government felt this was okay because most of the images were of non-citizens. This is from South Korean news agency Hankyoreh, which broke the story:

> Of the facial data transferred from the MOJ for use by private companies last year as part of this project, around 120 million images were of foreign nationals. Companies used 100 million of these for "AI learning" and another 20 million for "algorithm testing." The MOJ possessed over 200 million photographs showing the faces of approximately 90 million foreign nationals as of 2018, meaning that over half of them were used for learning.

With two-thirds of the freebie images being of foreigners, perhaps the South Korean government thought it would lower its incoming litigation footprint. But that still leaves nearly 58 million images of its own citizens. And there's nothing preventing foreign citizens from suing the South Korean government, even though this action can sometimes be considerably more expensive than suing locally.

Lawsuits are coming, though, according to Motherboard. Shortly after the discovery, civil liberty groups announced plans to represent both foreign and domestic victims in a lawsuit. The legal basis for the collection isn't being challenged. It's the distribution of the collected images, which no travelers expressly agreed to.

Precedent isn't on the government's side. "Internationally, it is difficult to find any precedent of actual immigration data from domestic and international travelers being provided to companies and used for AI development without any notification or consent," said Chang Yeo-Kyung, executive director of the Institute for Digital Rights.

It's pretty sad when democratic governments decide the people belong to the government, rather than the other way around. But as the march towards always-on surveillance continues in travel hubs and major cities, using members of the public as guinea pigs for AI development is probably going to become just as routine as the numerous, formerly-novel impositions placed on travelers shortly after the 9/11 attacks.