Multi-headed Machines

In preparation for today’s session we were asked to watch an interview in which Sam Altman (CEO, OpenAI) talks about Artificial Intelligence at the Big Compute 20 Tech Conference. OpenAI want to figure out how human intelligence works and are attempting to build AGI (Artificial General Intelligence) that is ‘safe and beneficial’, for commercial gain. Altman proposed that we are in the early stages of the ‘AI revolution’, and predicted that there will soon be ‘an explosion of systems that can really process, understand, interact and generate language, you’ll be able to have dialogue with a machine that really makes sense.’ He also stressed that the emphasis has shifted from Big Data to Big Compute, i.e. how much computing power you have now matters more than how much data you have.

‘We have an algorithm that can learn and it seems to keep scaling the more computing power you have.’ SA

One of a few worrying statements from Altman: ‘There will be a few people who can train large models for neural networks and it’s up to those companies to figure out how to share that power’ SA

HOW MACHINES READ

Hasan took us through an in-depth lecture and discussion around the workings and genealogy of Natural Language Processing, first explaining the difference between Supervised Learning (the model learns with human assistance, from labelled examples) and Unsupervised Learning (the model learns on its own from the given data). Hasan explained that there is a lot of complex, high-dimensional mathematics behind unsupervised ML and how this can be used to augment human intelligence, referencing this paper:

We then went on to discuss the MNIST dataset, Google Deep Dream, and how AI for language is a completely different problem from image-based AI systems. The problem with human language is that you need a lot of context to make sense of it.
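To make the supervised/unsupervised distinction concrete, here is a small sketch of my own (not from the lecture) using scikit-learn’s built-in digits dataset as a miniature stand-in for MNIST:

```python
# My own sketch contrasting supervised and unsupervised learning
# on scikit-learn's small 8x8 digits dataset (a cousin of MNIST).
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

# Supervised: the model learns from human-provided labels (y_train).
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("supervised accuracy:", clf.score(X_test, y_test))

# Unsupervised: the same images, but no labels; the model has to
# discover structure (here, 10 clusters) on its own.
km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X_train)
print("cluster assignments:", km.labels_[:10])
```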

‘Code is a formal syntax for thought’ Hasan S

Hasan stated that human language is not formal, as very slight changes in wording can change the meaning, as you can see in this example:

General NNs won’t work when dealing with language; RNNs (Recurrent Neural Networks) are used instead. A drawback of RNNs is that you have to limit how much of the input the model keeps in memory, otherwise it becomes very complex and runs slower and slower, and this results in some loss of information. To remedy this, LSTM (Long Short-Term Memory) architectures were developed, in which one pipeline (the cell state) runs continuously through the whole network, filtering out and carrying forward the important bits.
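A rough sketch of the idea in PyTorch (my own illustration, with made-up sizes): the LSTM’s cell state is the ‘pipeline’ that runs through the whole sequence, and its gates decide what to keep and what to forget:

```python
# My own illustration of an LSTM's cell state carrying information
# through a sequence. Sizes are arbitrary.
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)

x = torch.randn(1, 100, 32)        # one sequence of 100 word vectors
h0 = torch.zeros(1, 1, 64)         # initial hidden state
c0 = torch.zeros(1, 1, 64)         # initial cell state: the "pipeline"
out, (hn, cn) = lstm(x, (h0, c0))  # cn is the filtered long-term memory

print(out.shape)  # torch.Size([1, 100, 64]): one output per time step
```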

The next evolution in these architectures was the Transformer model, and ‘attention’ is the focus of this technology. ‘Multi-headed attention’ figures out the most interesting (relevant) parts of the input data, while ‘masked multi-headed attention’ does the same when generating text, but is only allowed to look at the words produced so far rather than the whole sequence. OpenAI’s GPT models are based on the Transformer architecture, and this kind of technology scales up really well.
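Here is a hand-rolled sketch (my own, in PyTorch) of the attention operation at the heart of the Transformer; the mask is what turns plain attention into ‘masked’ attention, and running several of these in parallel gives the ‘multi-headed’ version:

```python
# My own sketch of scaled dot-product attention: each position scores
# every other position for relevance, and the scores weight a sum of values.
import math
import torch

def attention(q, k, v, mask=None):
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        # "Masked" attention: blank out positions (e.g. future words)
        # so the model cannot attend to them.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)  # relevance of each position
    return weights @ v

seq_len, d = 5, 16
q = k = v = torch.randn(1, seq_len, d)
causal = torch.tril(torch.ones(seq_len, seq_len))  # hide future positions
out = attention(q, k, v, mask=causal)
print(out.shape)  # torch.Size([1, 5, 16])
```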

Hasan took us through a demo of OpenAI’s GPT-3 model, which has 175 billion parameters and comes close to representing human language. The hardware to train this model cost 12 million dollars.

YOU + POEM + AI & ML =

In the second half of the day we were asked to think of a poet that represents our ‘home ground’. I chose a couple of contemporary poets based in Scotland whose poems resonated with me. Here are links to the three poems I selected for the task:

I fed small sections from these poems into GPT-3 (Generative Pre-trained Transformer 3); a sketch of how this kind of prompting works in code follows the examples below:

Premise: A darker history of Glasgow
Poem: Glasgow, late September and the city I spoke of
in another country (its fierce sandstone
burning, its bombast of finials built on the heads of slaves
the monumental tower blocks of a later order
catching fire from the west as a plane comes in)

Premise: environmental concerns
Poem: Earth pushes up
through our footsoles
air spins us
in its thinning spheres
time flies through us

Premise: The world is sick
Poem: The world remembers how it got here.
And with that memory comes a sharp pain in the left temple of the world
Instinct compels it to raise a hand to the skull to nurse the wound,
But neurons fire to dead limbs.
The world cannot move

…and this is what it responded with:

Premise: English youth
Poem: Everyone was trying to flog us
Coke, acid, trips to Ibiza, saying
our club’s shit and it’s too packed now
We didn’t need any of it.

Premise: The Internet
Poem: We are the things we share
We are the things that aren’t ours
We are the things we like
And the things we don’t like
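For reference, what we were doing above is few-shot prompting: you show the model a handful of Premise/Poem pairs and let it continue the pattern. A minimal sketch of my own, using the openai Python package as it worked at the time; the engine name and sampling settings are my assumptions:

```python
# Assumed sketch of sending the Premise/Poem pairs to GPT-3 via the
# openai package's completion API (pre-1.0 versions of the library).
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# A handful of Premise/Poem examples (abbreviated here), ending with a
# bare "Premise:" so the model invents its own premise and poem next.
prompt = """\
Premise: A darker history of Glasgow
Poem: Glasgow, late September and the city I spoke of ...

Premise: environmental concerns
Poem: Earth pushes up / through our footsoles ...

Premise: The world is sick
Poem: The world remembers how it got here. ...

Premise:"""

response = openai.Completion.create(
    engine="davinci",   # the original GPT-3 base model
    prompt=prompt,
    max_tokens=100,
    temperature=0.9,    # higher temperature tends to give stranger poems
)
print("Premise:" + response.choices[0].text)
```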

I was placed in a group with Upsana Nattoji Roy and Shayekh Mohammad Arif and we combined our poems as the input data for GPT-3:

Selected output:

Premise: balancing act
Poem: Big brother is watching you! Big brother is watching me!
So, we better step easily (with guns), walk past the drones silently (wearing bags)…
What the future brings is what, the leaders will hatch!
Their big ideas must be careful; rigged to protect the elite!

I think this is fairly apt when thinking of some of Sam Altman’s statements.

HUMAN_NONHUMAN_ALGORITHMIC_GESTURES

I will be working with Upsana and Shayekh toward a collaborative artwork using some of the tools we have been experimenting with this week. We began to map out our common interests and potential directions for the project:
