Services offered by Big Data Analysis / Machine Learning provider wise.io
AGNES - Cecile B Evans
AGNES is a bot by artist Cecile B Evans, commissioned by the Serpentine Gallery as the first in a series of digital art commissions. The bot lives on the Serpentine website, accessible by clicking an icon shaped like two cupped hands. Technically the bot is closer to an impersonation of a bot - the website triggers a series of canned messages left by the artist in a vocoded voice - but this is rather endearing and honest; I’m sure much of Siri is canned response too. The bot has a rather apologetic tone, admitting to its mistakes, and it takes you to unexpected information related to the Serpentine’s staff, sponsors and activities (well, at least that’s where I ended up). Visitors are encouraged to return at different times of the day for different experiences.
AGNES lives on the website. She wants to share things with you and learn about your thoughts and feelings. The more you give back, the further AGNES will accompany you on your digital encounter with the Serpentine Galleries. She will introduce you to the new website, showing you the artworks she thinks you might appreciate, while also giving you access to the information she’s aggregated as a digital dweller. The knowledge shared by AGNES is subjective and inclusive; conveyed via personal associations rather than objective points of reference.
I think of goodbye.
Locked tight in the night
I think of passion;
Drawn to for blue, the night
During the page
My shattered pieces of life
watching the joy
shattered pieces of love
My shattered pieces of love
―"Long years have passed", current leader in "Most human-like computer poems" on Leaderboard | bot or not, “a Turing test for poetry. You, the judge, have to guess whether the poem you’re reading is written by a human or by a computer”, via @goto80 (via new-aesthetic)
Stock Forecast Based On a Predictive Algorithm - IKnowFirst
The system is a predictive algorithm based on Artificial Intelligence (AI) and Machine Learning (ML), incorporating elements of Artificial Neural Networks and Genetic Algorithms.
The I Know First Market Prediction System models and predicts the flow of money between the markets. It separates the predictable part from stochastic (random) noise, then creates a model that projects the future trajectory of the given market in the multidimensional space of other markets.
The system outputs the predicted trend as a number, positive or negative, along with the wave chart that predicts how the waves will overlap the trend. This helps the trader to decide which direction to trade, at what point to enter the trade, and when to exit.
The model is 100% empirical, meaning it is based on historical data rather than on any human-derived assumptions. The human factor is involved only in building the mathematical framework and initially presenting the “starting set” of inputs and outputs to the system.
From that point onwards the computer algorithms take over: they constantly propose “theories” and test them automatically on years of daily market data, then validate them on the most recent data, which prevents over-fitting. Some inputs are “rejected”, meaning they don’t improve the model; another input may then be substituted in their place.
This bootstrapping system is self-learning, and thus live. The resulting formula is constantly evolving, as new daily data is added and as better machine-proposed “theories” are found.
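The actual I Know First model is undisclosed, but the propose-and-test loop described above can be sketched in a few lines. Everything here is hypothetical: made-up data, a trivial one-variable fit standing in for the real model, and a train/recent-holdout split as the over-fitting guard.

```python
import random

random.seed(0)

# Hypothetical market data: 300 days, 6 candidate input series.
# The target really depends only on inputs 0 and 3, plus noise.
DAYS, INPUTS = 300, 6
X = [[random.gauss(0, 1) for _ in range(INPUTS)] for _ in range(DAYS)]
y = [2.0 * row[0] - 1.5 * row[3] + random.gauss(0, 0.1) for row in X]

# Fit on older data, judge on the most recent days.
train, holdout = range(250), range(250, DAYS)

def holdout_error(feature):
    """Univariate least-squares fit on the training window,
    scored on the held-out recent days."""
    xs = [X[t][feature] for t in train]
    ys = [y[t] for t in train]
    slope = sum(x * yv for x, yv in zip(xs, ys)) / sum(x * x for x in xs)
    return sum((slope * X[t][feature] - y[t]) ** 2
               for t in holdout) / len(holdout)

# A proposed input is kept only if it beats predicting nothing at all;
# otherwise it is "rejected", as the text puts it.
baseline = sum(y[t] ** 2 for t in holdout) / len(holdout)
accepted = [f for f in range(INPUTS) if holdout_error(f) < baseline]
print(accepted)  # the inputs that actually carry signal survive
```

Re-running the loop as each new day of data arrives is what makes such a system "live": the accepted set, and the fitted weights, keep shifting with the data.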
New Media as Faster Execution of Algorithms Previously Executed Manually or Through Other Technologies - Lev Manovich (page 7)
A modern digital computer is a programmable machine. This simply means that the same computer can execute different algorithms. An algorithm is a sequence of steps that need to be followed to accomplish a task. Digital computers allow most algorithms to be executed very quickly; in principle, however, an algorithm, since it is just a sequence of simple steps, can also be executed by a human, although much more slowly. For instance, a human can sort files in a particular order, or count the number of words in a text, or cut a part of an image and paste it in a different place. This realization gives us a new way to think about both digital computing, in general, and new media, in particular: as a massive speed-up of various manual techniques that already existed. Consider, for instance, a computer’s ability to represent objects in linear perspective and to animate such representations. When you move your character through the world in a first-person shooter computer game (such as Quake), or when you move your viewpoint around a 3D architectural model, a computer re-calculates perspectival views for all the objects in the frame many times every second (on current desktop hardware, frame rates of 80 frames per second are not uncommon). But we should remember that the algorithm itself was codified during the Renaissance in Italy, and that, before digital computers came along (that is, for about five hundred years), it was executed by human draftsmen. Similarly, behind many other new media techniques there is an algorithm that, before computing, was executed manually.
…Substantially speeding up the execution of an algorithm by implementing this algorithm in software does not just leave things as they are. The basic point of dialectics is that a substantial change in quantity (i.e., in speed of execution in this case) leads to the emergence of qualitatively new phenomena. The example of the automation of linear perspective is a case in point. Dramatically speeding up the execution of a perspectival algorithm makes possible a previously non-existent representational technique: smooth movement through a perspectival space. In other words, we get not only quickly produced perspectival drawings but also computer-generated movies and interactive computer graphics.
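The "perspectival algorithm" Manovich refers to reduces, in its simplest pinhole-camera form, to a divide by depth. A renderer repeats this for every vertex, many times per second; a Renaissance draftsman did the equivalent construction by hand. A minimal sketch (focal length and points are arbitrary):

```python
def project(point, focal=1.0):
    """Project a 3D point (x, y, z) onto the image plane at z = focal.
    The perspective divide: farther objects shrink proportionally."""
    x, y, z = point
    scale = focal / z
    return (x * scale, y * scale)

# Two posts of equal height, one twice as far away: the far one
# appears half as tall - exactly the Renaissance construction.
near = project((0.0, 1.0, 2.0))
far = project((0.0, 1.0, 4.0))
print(near, far)  # (0.0, 0.5) (0.0, 0.25)
```

A game engine does this (plus rotation, clipping, and rasterization) for every object in the frame, which is the "massive speed-up" that turns a drawing technique into navigable space.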
Algorithms that use existing socio-political data to predict when and where future mass atrocities might happen could soon inform governments and NGOs to take preventative measures. The algorithms were developed by the participants of the Tech Challenge for Atrocity Prevention, a competition run by the US Agency for International Development (USAID) and NGO Humanity United.
The competition started from the premise that certain social and political measurements are linked to increased likelihood of atrocities. The algorithms use sociopolitical indicators and data on past atrocities as their inputs. The data was drawn from archives such as the Global Database of Events, Language and Tone, a data set that encodes more than 200 million globally newsworthy events, recording cultural information such as the people involved, their location and any religious connections.
Gay Check Online may seem totally offensive and inappropriate at first, but think back to reports in March 2013 that Facebook had sussed out that someone was gay without many clues to go on, hinting perhaps at an algorithmic, Bayesian deduction that was unfavourably made available to ad-placement software. The artists are clearly commenting on the hidden motives of the data-mining and statistical analysis being carried out by services such as Facebook. Categories for people that are useful for ad targeting are usually decided using Bayesian probability - the system may not know your age, gender, political or sexual orientation, but your online behaviour may match a certain pattern that helps ‘predict’ such details.
Gay Check Online makes visible and parodies these systems using face-detection software and an algorithm that works in under ten seconds.
Based on scientific studies about facial characteristics of gays, the Internet Agency NETRO has created an online tool to verify your sexual orientation in under 10 seconds. NETRO wrote an algorithm to compare your face with the original databases from the studies of the Charles University in Prague and the Academy of Sciences of the Czech Republic. In approximately 10 seconds a face is measured and analyzed and the sexual orientation can be determined. Gay Check Online is a rapid and simple method to provide the user with a sense of security and clarity.
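The Bayesian inference the post gestures at - guessing a hidden category from observed behaviour - can be sketched with a toy naive-Bayes classifier. To be clear, this is not Facebook's or NETRO's system; the categories and behavioural signals below are entirely made up.

```python
from collections import defaultdict

# Hypothetical training pairs: (hidden category, observed signals).
training = [
    ("A", {"likes_page_x", "uses_app_y"}),
    ("A", {"likes_page_x"}),
    ("B", {"uses_app_y", "visits_site_z"}),
    ("B", {"visits_site_z"}),
]

prior = defaultdict(int)                       # how common each category is
likelihood = defaultdict(lambda: defaultdict(int))  # signal counts per category
for cat, signals in training:
    prior[cat] += 1
    for s in signals:
        likelihood[cat][s] += 1

def predict(signals):
    """Pick the category with the highest (Laplace-smoothed) posterior."""
    total = sum(prior.values())
    best_cat, best_p = None, 0.0
    for cat, count in prior.items():
        p = count / total
        for s in signals:
            # +1 smoothing so an unseen signal doesn't zero out p
            p *= (likelihood[cat][s] + 1) / (count + 2)
        if p > best_p:
            best_cat, best_p = cat, p
    return best_cat

print(predict({"likes_page_x"}))  # behaviour pattern matches category "A"
```

The unsettling part, as the post notes, is that the system never needs to be told the attribute directly - behavioural correlation stands in for knowledge.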
Big data’s escalating interest in and successful use of preemptive predictions as a means of avoiding risk becomes a catalyst for various new forms of social preemption. More and more, governments, corporations, and individuals will use big data to preempt or forestall activities perceived to generate social risk. Often, this will be done with little or no transparency or accountability. Some loan companies, for example, are beginning to use algorithms to determine interest rates for clients with little to no credit history, and to decide who is at high risk for default. Thousands of indicators are analyzed, ranging from the presence of financially secure friends on Facebook to time spent on websites and apps installed on various data devices. Governments, in the meantime, are using this technique in a variety of fields in order to determine the distribution of scarce resources such as social workers for at-risk youth or entitlement to Medicaid, food stamps, and welfare compensation.
Of course, the preemption strategy comes at a significant social cost. As an illustration, consider the practice of using predictive algorithms to generate no-fly lists. Before the development of many such lists in various countries, high-risk individuals were generally at liberty to travel—unless the government had a sufficient reason to believe that such individuals were in the process of committing an offense. In addition to curtailing liberty, a no-fly list that employs predictive algorithms preempts the need for any evidence or constitutional safeguards. Prediction simply replaces the need for proof.
Taken to its logical extreme, the preemption philosophy is not merely proactive—it is aggressive."
―Prediction, Preemption, Presumption: How Big Data Threatens Big Picture Privacy, by Ian Kerr and Jessica Earle.
A seminal reading on preemptive prediction algorithms and big data.
Thanks to smartphones or Google Glass, we can now be pinged whenever we are about to do something stupid, unhealthy, or unsound. We wouldn’t necessarily need to know why the action would be wrong: the system’s algorithms do the moral calculus on their own. Citizens take on the role of information machines that feed the techno-bureaucratic complex with our data. And why wouldn’t we, if we are promised slimmer waistlines, cleaner air, or longer (and safer) lives in return?
On issues like obesity or climate change—where the policy makers are quick to add that we are facing a ticking-bomb scenario—they will say a little deficit of democracy can go a long way.
Here’s what that deficit would look like: the new digital infrastructure, thriving as it does on real-time data contributed by citizens, allows the technocrats to take politics, with all its noise, friction, and discontent, out of the political process. It replaces the messy stuff of coalition-building, bargaining, and deliberation with the cleanliness and efficiency of data-powered administration.
This phenomenon has a meme-friendly name: “algorithmic regulation,” as Silicon Valley publisher Tim O’Reilly calls it. In essence, information-rich democracies have reached a point where they want to try to solve public problems without having to explain or justify themselves to citizens. Instead, they can simply appeal to our own self-interest—and they know enough about us to engineer a perfect, highly personalized, irresistible nudge."
―Evgeny Morozov - The real privacy problem.
Spanish-speaking Twitter bots alleging that a Mexican reporter was NOT killed by the narcotics mob
Mark Zuckerberg dreams of a day when Facebook’s computers would know you and your habits so well that it would deliver exactly the information you want to see — what he calls “the best personalized newspaper in the world.” …
Sites that rely heavily on viral content get about 80 percent of their traffic from Facebook, said Edward Kim, the chief executive of the social media tracking service SimpleReach. More traditional news sites get about 50 percent from social media. Since the [Facebook newsfeed] algorithm has changed, Mr. Kim said, some viral sites have seen a precipitous decline in their traffic. Given Facebook’s clout, the news organizations that have come to rely on the company for large quantities of their traffic are trying to tailor content to appeal to its mysterious algorithms."
― NY Times
That’s the premise driving a new startup called Eterni.me, which emerged this week out of MIT’s Entrepreneurship Development Program. Its goal, according to the startup’s website, is to emulate your personality by tapping into your digital paper trail—chat logs, emails, and the like. Once that information is provided, an algorithm splices together all those you-isms to build an artificial intelligence based on your personality, which “can interact with and offer information and advice to your family and friends after you pass away.” (via Eterni.me Wants To Let You Skype Your Family After You’re Dead | Fast Company | Business Innovation)
Paul Lansky is an algorithmic composer who has been producing works since the seventies. His Idle Chatter pieces from the Homebrew album employ Linear Predictive Coding, granular synthesis and his own algorithms to produce “a kind of mathematical complexity, there are tons of things going on and you don’t really know what the main voice is.” (Lansky, Interview 97). An ontological study revealed that:
Lansky considers software writing, or ‘instrument-building’ as he calls it, integral to the composing process, so stopping to code a new feature or algorithm is not considered an interruption to the composing process. Although he provides the applications (Cmix, RT, GQ, EIN) to anyone who wants to use them, via the Princeton Sound Kitchen web site, he considers the algorithms (Cmix scripts) of his to be integral to the composition and does not distribute those. In this way they can be considered as partial scores describing the works or more usually sections of the work and specific timbral manipulations.
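Since Lansky's actual Cmix scripts are not distributed, here is only a generic sketch of one of the named techniques: granular synthesis, in which a source signal is chopped into short enveloped "grains" that are overlap-added back in a new arrangement. All parameters below are arbitrary.

```python
import math

RATE = 8000
# Stand-in source: one second of a 440 Hz sine (a real piece would
# granulate recorded speech or instruments).
source = [math.sin(2 * math.pi * 440 * n / RATE) for n in range(RATE)]

def granulate(signal, grain_len=400, hop=200):
    """Chop into Hann-windowed grains, then overlap-add them back
    in reverse order - a crude scrambling of the source's timeline."""
    grains = []
    for start in range(0, len(signal) - grain_len + 1, hop):
        g = [signal[start + i] *
             (0.5 - 0.5 * math.cos(2 * math.pi * i / grain_len))  # Hann window
             for i in range(grain_len)]
        grains.append(g)
    out = [0.0] * len(signal)
    for idx, g in enumerate(reversed(grains)):
        pos = idx * hop
        for i, sample in enumerate(g):
            out[pos + i] += sample
    return out

texture = granulate(source)
print(len(texture))  # same length as the source
```

Varying grain length, hop, and reordering rule is where compositional character comes from - which is presumably why Lansky treats his scripts as part of the score rather than as reusable tools.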
Dating Site Bots - via Wired
Mathematician Chris McKinlay used algorithms to outsmart the algorithmic dating site OkCupid and find a date, resulting in a flood of date requests.
McKinlay first automated twelve OkCupid accounts using a Python script to scrape as much information about women as possible into a database. The bots would randomly answer questions, generating matches and revealing information about others that was otherwise hidden from view. His bots simulated human timing to avoid being caught by OkCupid’s filters. After three weeks he’d harvested 6 million questions and answers from 20,000 women all over the country.
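The human-timing trick amounts to randomizing the pauses between requests rather than firing at machine-regular intervals. A hedged sketch - `fetch_profile` is a placeholder, not OkCupid's API, and the pauses are shortened for the demo:

```python
import random
import time

def fetch_profile(profile_id):
    """Stand-in for the real scraping call; no actual network I/O."""
    return {"id": profile_id}

def scrape(profile_ids, min_pause=0.0, max_pause=0.01):
    """Fetch profiles with randomized, human-like pauses between hits."""
    results = []
    for pid in profile_ids:
        results.append(fetch_profile(pid))
        # A human never clicks at perfectly regular intervals.
        time.sleep(random.uniform(min_pause, max_pause))
    return results

profiles = scrape(range(5))
print(len(profiles))  # 5
```

In a real deployment the pauses would be seconds to minutes, jittered around plausible reading times.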
Next he analysed the data with an algorithm called K-Modes, developed at Bell Labs to study diseased soybean crops. The algorithm clumps data into categories, and for the scraped OkCupid data it identified seven statistically distinct groups. Each group tends to answer questions in similar ways: one group, for instance, tends to strongly believe in God, so he named it ‘God’. Another tends to be older yet adventuresome; he named it ‘Samantha’. The groups identified were: Dog, Green, Mindful, Tattoo, God, Samantha and Diverse.
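K-modes is essentially k-means for categorical data: distance becomes a count of mismatched answers (Hamming distance), and cluster centres become per-column modes instead of means. A stripped-down sketch - the questionnaire answers below are made up, not OkCupid's:

```python
from collections import Counter

def hamming(a, b):
    """Number of positions where two answer tuples disagree."""
    return sum(x != y for x, y in zip(a, b))

def column_modes(rows):
    """Most common answer in each column - the cluster's 'mode'."""
    return tuple(Counter(col).most_common(1)[0][0] for col in zip(*rows))

def k_modes(rows, modes, iters=10):
    """Assign each row to its nearest mode, then recompute the modes."""
    for _ in range(iters):
        clusters = [[] for _ in modes]
        for row in rows:
            nearest = min(range(len(modes)),
                          key=lambda i: hamming(row, modes[i]))
            clusters[nearest].append(row)
        modes = [column_modes(c) if c else modes[i]
                 for i, c in enumerate(clusters)]
    return modes, clusters

# Two obvious answer patterns; seed the modes with one row from each.
answers = [("yes", "agree", "often")] * 5 + [("no", "disagree", "never")] * 5
modes, clusters = k_modes(answers, [answers[0], answers[-1]])
print([len(c) for c in clusters])  # [5, 5]
```

Production implementations randomize the initial modes and restart several times; with k=7 on 20,000 respondents this same loop yields groups like the ones McKinlay named.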
His preferred groups were Dog and Tattoo, made up mostly of creative professionals and young arty types respectively. To promote himself to these groups he text-mined their data and preferred questions to tailor two closely matching profiles of himself - he claims he was still honest about himself, just selective and targeted. However, OkCupid also has a question-rating system, so people are matched partly on how important they think each question is. To get these ratings right, McKinlay used a machine-learning algorithm called Adaptive Boosting to derive and apply the best weightings.
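Adaptive Boosting (AdaBoost) builds a strong predictor from a sequence of weak rules, weighting each rule by how well it fixes the previous rules' mistakes - which is why it suits a "derive the best weightings" problem. Below is a minimal sketch with one-feature threshold "stumps" on a made-up toy set, not McKinlay's actual data or code:

```python
import math

def stump_predict(x, feature, threshold, sign):
    """Weak rule: +sign if the feature exceeds the threshold, else -sign."""
    return sign if x[feature] > threshold else -sign

def best_stump(X, y, w):
    """Exhaustively pick the stump with the least weighted error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for sign in (1, -1):
                err = sum(wi for x, yi, wi in zip(X, y, w)
                          if stump_predict(x, f, t, sign) != yi)
                if err < best_err:
                    best, best_err = (f, t, sign), err
    return best, best_err

def adaboost(X, y, rounds=5):
    n = len(X)
    w = [1.0 / n] * n                       # start with uniform weights
    ensemble = []
    for _ in range(rounds):
        (f, t, sign), err = best_stump(X, y, w)
        err = max(err, 1e-10)               # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)  # this rule's importance
        ensemble.append((alpha, f, t, sign))
        # Up-weight the examples this rule got wrong, then renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(x, f, t, sign))
             for x, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    s = sum(a * stump_predict(x, f, t, sign) for a, f, t, sign in ensemble)
    return 1 if s > 0 else -1

# Toy labels: +1 when the single answer score is high.
X = [(0.1,), (0.2,), (0.3,), (0.7,), (0.8,), (0.9,)]
y = [-1, -1, -1, 1, 1, 1]
model = adaboost(X, y)
print([predict(model, x) for x in X])  # matches y
```

The learned alphas are the "weightings": they say how much each weak question-level rule should count in the final decision.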
The next automation trick: getting noticed. OkCupid members are notified when someone has checked out their account, so McKinlay wrote a script to visit thousands of profiles on his behalf, making sure to cover the entirety of his target groups. The method was effective: people visited his profile in return, averaging 400 profile views a day.
Then messages and dating requests began rolling in. He went on countless dates, working his way through his ‘queue’ for over a month and recording every date in a logbook. On date number 88 he found a partner that stuck, and revealed the entire experiment to her; luckily for him, she found it amusing.
“I think that what I did is just a slightly more algorithmic, large-scale, and machine-learning-based version of what everyone does on the site,” says McKinlay, who now promotes a book about his methods on Amazon and his website.
The Internet of Everything (as suggested by Cisco) - via futurescope
Design fiction / future-speculation short with plenty of automated systems, from self-driving cars to agriculture. It’s a vision forecasting the future for the sort of man who has a highly paid job at Google; I’m not sure how the other 99% fit in, or where their cars and homes have gone.