The truth and nothing but the (Crony Google regime propaganda) truth.
Google's new search engine model will base its rankings on the "truthful content" of a site. But whose truth? The regime's political propaganda truth, or actual facts? Take a wild guess. Opposing climate-change views, opposition political facts, opposition speech, and anything else that does not agree with Pravda and the like will be all but impossible to find through the new search model. It may all be buried under the "New Google." "You shall Google, but you shall not find" <-- Sounds like a great new slogan, doesn't it? So if Google believes we must pass illegal immigration, that = Truth, and you shall find all the sites agreeing with it. Is the regime funding arms deals with terrorist-sponsoring countries? Only the propaganda sites denying such activities shall be found. Is Benghazi a cover-up? Only sites denying there is a cover-up shall be found, for truth is in the eye of the beholder in America these days. If this doesn't alarm you, then no problem ("no hay problema"). Naturally they will dress this censorship ploy up as being for the good of the people's republic, just as they do in China and Iran and other countries; even England has such plans. A worldwide "truthful content" based search engine. And who never lies to you? The government, naturally.
Google Develops System For Ranking ‘Trustworthiness’ Of Websites
http://www.breitbart.com/big-government ... -websites/
by John Hayward, 2 Mar 2015
A report at NewScientist describes a research paper from a Google team as presenting a “fix” for the spread of “garbage” across the Internet: an algorithm that would rank web pages based on their “trustworthiness” by automatically detecting and tabulating “false facts” on each web page.
Like every other pretense of calculating Objective Truth with a formula – or “fact-checking” the Internet with a team of supposedly disinterested and unbiased clergy of truth-seekers – it’s a concept brimming with the potential for abuse. The ink isn’t even dry on the government takeover of the Internet, and we’re already setting up office space for the Ministry of Truth? Everything really does happen faster on the Internet.
The “problem” addressed by the research paper NewScientist references is that Google’s search algorithm currently ranks websites based on their popularity, using “the number of incoming links to a web page as a proxy for quality.” The drawback to this approach is that “websites full of misinformation can rise up the rankings, if enough people link to them.”
Such rankings can even be influenced by a large number of links from Internet users seeking to challenge the claims made on the page, which is one reason it’s become increasingly common practice to link to third-party references to a disputed page. This, in turn, can create self-reinforcing rings of mistaken skepticism, in which those who challenge a website link only to each other, circulating increasingly inaccurate citations of the original page that was challenged… and perhaps denying readers an opportunity to see clarifications, updates, or retractions made on the original page.
The pursuit of Objective Truth is a difficult business, and it will probably continue to stubbornly resist even the most well-meaning efforts to automate it. Not all of those efforts are well-meaning. The NewScientist post approvingly references a few “fact-check” websites that have themselves been rocked by devastating challenges to their impartiality and accuracy.
Some self-described “fact-check” sites are outright jokes. Google’s “Knowledge Vault,” the prospective source of pure and undiluted truth for trustworthiness rankings, is described as a “vast store” of facts validated by the near-unanimous agreement of the Web. Gee, what could go wrong with that?
The temptation for self-appointed Gatekeepers of Truth, especially one as powerful as Google, to fudge their sacred (and enormously complex) truth-detecting formula would be enormous. Even if the formula is kept pure and delivers initially sound results, it could be corrupted by inputting false data, or manipulated by writers who learn how to beat its tests.
Over time, it’s not unreasonable to assume that the websites most commonly beaten down in the rankings due to “trustworthiness” errors would be those written by people who haven’t carefully studied the trustworthiness algorithm and learned how to play games with it.
Then there’s the matter of the Devil’s favorite sort of deception: the “half-truth,” a false claim packed with valid, but insufficient, nuggets of fact. Inferences are difficult to mechanically evaluate. An algorithm designed to detect assertions that run contrary to verified data isn’t going to detect truth left undelivered, context that isn’t properly established, or contrary evidence conveniently left unmentioned.
The level of confidence associated with automatically tabulated “trustworthiness” rankings seems likely to exceed the actual trustworthiness of the pages, as most people understand the meaning of that term. Cleaning up the “garbage” on the Internet would involve a lot more than reducing the search-engine priority of a few highly popular but empirically incorrect web pages; there is danger in persuading users to believe that such measures are sufficient.
There’s also danger in asserting the power to do such things automatically, without an opt-in from users. (If this “ranking by trustworthiness” concept gets past the theoretical stage, perhaps Google will implement it with such an opt-in. Big Internet companies have been burned a few times in the recent past by public outcry over the subtle manipulation of their behavior by stealthy changes to their Web experience, made without explicit user awareness and consent.) A lot seems to be happening to us “automatically” these days; some of these systems are accepted as helpful, while others increase the sense of unease that average end users have lost control of the Internet.
There’s a considerable paradigm shift involved in this ranking-by-facts concept, as it would transform the ranking of web pages from an external process controlled by the great and unruly mass of users – who make pages popular by linking to them – into an internal procedure controlled by Google, and those who learn how to take advantage of its system. Not that existing search algorithms are impossible to manipulate, of course – far from it! – but that transition to internal control is something users might want to ponder at length before signing on to it, assuming they are given the choice of not signing on.
Here are a few more for those who care enough to read what's ahead for us.
Google to Bias Search Engine Based on ‘Facts’ http://www.nationalreview.com/corner/41 ... ey-j-smith
And another article
Google wants to rank websites based on facts not links
* 28 February 2015 by Hal Hodson
* Magazine issue 3010
The trustworthiness of a web page might help it rise up Google's rankings if the search giant starts to measure quality by facts, not just links
The internet is stuffed with garbage. Anti-vaccination websites make the front page of Google, and fact-free "news" stories spread like wildfire. Google has devised a fix – rank websites according to their truthfulness.
Google's search engine currently uses the number of incoming links to a web page as a proxy for quality, determining where it appears in search results. So pages that many other sites link to are ranked higher. This system has brought us the search engine as we know it today, but the downside is that websites full of misinformation can rise up the rankings, if enough people link to them.
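The link-counting approach the article describes can be illustrated with a toy ranking function. (The page names and link data here are made-up examples, and Google's real ranking is far more sophisticated than a raw in-link count — this only sketches the "popularity as a proxy for quality" idea and its downside.)

```python
from collections import Counter

def rank_by_inlinks(links):
    """Rank pages by number of incoming links (a crude proxy for quality).

    `links` is a list of (source, target) pairs; pages that many others
    link to float to the top, regardless of whether they are accurate.
    """
    inlinks = Counter(target for _source, target in links)
    return sorted(inlinks, key=inlinks.get, reverse=True)

# Three blogs link to a misinformation page; one journal links to an accurate one.
links = [
    ("blogA", "misinfo.example"), ("blogB", "misinfo.example"),
    ("blogC", "misinfo.example"), ("journal", "accurate.example"),
]
print(rank_by_inlinks(links))  # misinfo.example outranks accurate.example
```

This is exactly the failure mode the article points out: the popular-but-wrong page wins.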
A Google research team is adapting that model to measure the trustworthiness of a page, rather than its reputation across the web. Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. "A source that has few false facts is considered to be trustworthy," says the team (arxiv.org/abs/1502.03519v1). The score they compute for each page is its Knowledge-Based Trust score.
The software works by tapping into the Knowledge Vault, the vast store of facts that Google has pulled off the internet. Facts the web unanimously agrees on are considered a reasonable proxy for truth. Web pages that contain contradictory information are bumped down the rankings.
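A minimal sketch of that scoring idea follows. The fact store, the page's extracted claims, and the scoring rule are all hypothetical simplifications: the actual Knowledge-Based Trust model in the paper is a probabilistic estimate over extracted triples, not a raw ratio of matches.

```python
# Hypothetical "knowledge vault": (subject, attribute) -> widely agreed value.
VAULT = {
    ("earth", "shape"): "round",
    ("water", "formula"): "H2O",
}

def kbt_score(claims):
    """Fraction of a page's checkable claims that match the vault.

    `claims` maps (subject, attribute) -> value as extracted from the page.
    Claims absent from the vault are ignored; fewer false facts -> higher score.
    """
    checkable = [(k, v) for k, v in claims.items() if k in VAULT]
    if not checkable:
        return 0.5  # no evidence either way
    correct = sum(VAULT[k] == v for k, v in checkable)
    return correct / len(checkable)

# A page with one claim matching the vault and one contradicting it.
page = {("water", "formula"): "H2O", ("earth", "shape"): "flat"}
print(kbt_score(page))  # 0.5
```

Note how the sketch already exposes the objections raised above: the vault's contents decide what counts as "false," and claims the vault has never seen simply don't move the score.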
There are already lots of apps that try to help internet users unearth the truth. LazyTruth is a browser extension that skims inboxes to weed out the fake or hoax emails that do the rounds. Emergent, a project from the Tow Center for Digital Journalism at Columbia University, New York, pulls in rumours from trashy sites, then verifies or rebuts them by cross-referencing to other sources.
LazyTruth developer Matt Stempeck, now the director of civic media at Microsoft New York, wants to develop software that exports the knowledge found in fact-checking services such as Snopes, PolitiFact and FactCheck.org so that everyone has easy access to them. He says tools like LazyTruth are useful online, but challenging the erroneous beliefs underpinning that information is harder. "How do you correct people's misconceptions? People get very defensive," Stempeck says. "If they're searching for the answer on Google they might be in a much more receptive state."
This article appeared in print under the headline "Nothing but the truth"
Google You tube Slacker G Guitar skills (1&2)
The same spirit that ruled over Hitler is headed our way.
Let those with ears to hear understand.