{
"cells": [
{
"cell_type": "markdown",
"id": "d4e131d0",
"metadata": {},
"source": [
"# * .* A tune in **. . * \n",
"\n",
"A workshop by [Varia](http://varia.zone/en/).\n",
"\n",
"We are now on a Jupyter Notebook. The code you are running is downloaded on one of our computers, and you are accessing it from a local server. To run the code, select the code cells in order and press \"Run\" in the toolbar above.\n",
"\n",
"\n",
"## Pincelate\n",
"\n",
"[Pincelate](https://pypi.org/project/pincelate/) is a machine learning model for spelling English words and sounding them out developed by artist [Allison Parrish](https://www.decontextualize.com/). It relies on particular versions of the Google-funded pre-trained machine learning models in [Tensorflow](https://www.tensorflow.org/) and the deep-learning API [Keras](https://keras.io/).\n",
"\n",
"\n",
"### Installing Pincelate"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "ranking-bottom",
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[31mERROR: Could not find a version that satisfies the requirement tensorflow==1.15.5\u001b[0m\n",
"\u001b[31mERROR: No matching distribution found for tensorflow==1.15.5\u001b[0m\n",
"Requirement already satisfied: keras==2.2.5 in /home/ccl/.local/lib/python3.9/site-packages (2.2.5)\n",
"Requirement already satisfied: keras-preprocessing>=1.1.0 in /home/ccl/.local/lib/python3.9/site-packages (from keras==2.2.5) (1.1.2)\n",
"Requirement already satisfied: pyyaml in /home/ccl/.local/lib/python3.9/site-packages (from keras==2.2.5) (6.0)\n",
"Requirement already satisfied: scipy>=0.14 in /home/ccl/.local/lib/python3.9/site-packages (from keras==2.2.5) (1.9.3)\n",
"Requirement already satisfied: numpy>=1.9.1 in /home/ccl/.local/lib/python3.9/site-packages (from keras==2.2.5) (1.23.4)\n",
"Requirement already satisfied: six>=1.9.0 in /home/ccl/.local/lib/python3.9/site-packages (from keras==2.2.5) (1.16.0)\n",
"Requirement already satisfied: keras-applications>=1.0.8 in /home/ccl/.local/lib/python3.9/site-packages (from keras==2.2.5) (1.0.8)\n",
"Requirement already satisfied: h5py in /home/ccl/.local/lib/python3.9/site-packages (from keras==2.2.5) (3.7.0)\n",
"Requirement already satisfied: pincelate in /home/ccl/.local/lib/python3.9/site-packages (0.0.1)\n",
"Requirement already satisfied: pronouncing>=0.2.0 in /home/ccl/.local/lib/python3.9/site-packages (from pincelate) (0.2.0)\n",
"Requirement already satisfied: Keras>=2.2.0 in /home/ccl/.local/lib/python3.9/site-packages (from pincelate) (2.2.5)\n",
"Requirement already satisfied: scikit-learn>=0.20.0 in /home/ccl/.local/lib/python3.9/site-packages (from pincelate) (1.1.3)\n",
"Requirement already satisfied: numpy>=1.9.1 in /home/ccl/.local/lib/python3.9/site-packages (from Keras>=2.2.0->pincelate) (1.23.4)\n",
"Requirement already satisfied: keras-applications>=1.0.8 in /home/ccl/.local/lib/python3.9/site-packages (from Keras>=2.2.0->pincelate) (1.0.8)\n",
"Requirement already satisfied: scipy>=0.14 in /home/ccl/.local/lib/python3.9/site-packages (from Keras>=2.2.0->pincelate) (1.9.3)\n",
"Requirement already satisfied: keras-preprocessing>=1.1.0 in /home/ccl/.local/lib/python3.9/site-packages (from Keras>=2.2.0->pincelate) (1.1.2)\n",
"Requirement already satisfied: h5py in /home/ccl/.local/lib/python3.9/site-packages (from Keras>=2.2.0->pincelate) (3.7.0)\n",
"Requirement already satisfied: six>=1.9.0 in /home/ccl/.local/lib/python3.9/site-packages (from Keras>=2.2.0->pincelate) (1.16.0)\n",
"Requirement already satisfied: pyyaml in /home/ccl/.local/lib/python3.9/site-packages (from Keras>=2.2.0->pincelate) (6.0)\n",
"Requirement already satisfied: cmudict>=0.4.0 in /home/ccl/.local/lib/python3.9/site-packages (from pronouncing>=0.2.0->pincelate) (1.0.2)\n",
"Requirement already satisfied: joblib>=1.0.0 in /home/ccl/.local/lib/python3.9/site-packages (from scikit-learn>=0.20.0->pincelate) (1.1.0)\n",
"Requirement already satisfied: threadpoolctl>=2.0.0 in /home/ccl/.local/lib/python3.9/site-packages (from scikit-learn>=0.20.0->pincelate) (3.1.0)\n"
]
}
],
"source": [
"import ipywidgets as widgets\n",
"from IPython.display import display,HTML\n",
"from ipywidgets import interact,interactive_output,Layout,HBox,VBox\n",
"import sys\n",
"!{sys.executable} -m pip install tensorflow==1.15.5\n",
"!{sys.executable} -m pip install keras==2.2.5\n",
"!{sys.executable} -m pip install pincelate"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "polar-modeling",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Using TensorFlow backend.\n"
]
}
],
"source": [
"from pincelate import Pincelate"
]
},
{
"cell_type": "markdown",
"id": "9f1e2cc7",
"metadata": {},
"source": [
"This is how Parrish describes the *manipulate* function of Pincelate: \"Pincelate actually consists of two models: one that knows how to sound out words based on how they're spelled, and another that knows how to spell words from sounds.\"\n",
"\n",
"With this function, words are passing through the two models, taking words, sounding them out from their spelling and then bringing them back into writing via phonemes. As a way to situate ourselves into the conference, we propose to take the conference titles as material to reshape, contort, or respell. \n",
"\n",
"By respelling, we are not only writing out words again, but also remaking the spells that words cast. \n",
"You can make the temperature higher to \"melt\" the words.\n",
"\n",
"\n",
"**CONFERENCE TITLES**\n",
"\n",
"Please keep Crashing (Workshop Documentation)\n",
"\n",
"The Future has died (Conference Opening)\n",
"\n",
"Race Against the Time Machine (tbc, Workshop)\n",
"\n",
"Decolonizing Digital Archives (Conversation) \n",
"\n",
"AI Between Apocalypse, Lethargy, and Wonder (Panel)\n",
"\n",
"DeColoniality of AI (Panel)\n",
"\n",
"Decolonial Weavings, Vernacular Algorithms\n",
"\n",
"Ecologies of Dreaming Beyond AI\n",
"\n",
"Technopoetics of Sound. Listening Session for Any | One Day\n",
"\n",
"Entangled Media Philosophies of Technology\n",
"\n",
"Predicting Backwards: Generating Histories\n",
"\n",
"Sedimented Temporalities of Geodigital Landscapes"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "muslim-labor",
"metadata": {},
"outputs": [
{
"ename": "NameError",
"evalue": "name 'Pincelate' is not defined",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)",
"Cell \u001b[0;32mIn [1], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m pin \u001b[38;5;241m=\u001b[39m \u001b[43mPincelate\u001b[49m()\n",
"\u001b[0;31mNameError\u001b[0m: name 'Pincelate' is not defined"
]
}
],
"source": [
"pin = Pincelate()\n",
"pin.manipulate(\"Future\",temperature=1.5)"
]
},
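{
"cell_type": "markdown",
"id": "soundout-spell-note",
"metadata": {},
"source": [
"The two models Parrish describes can also be called one at a time. Assuming the installation above succeeded, Pincelate exposes `soundout()` (spelling to phonemes) and `spell()` (phonemes to spelling); the sketch below shows the round trip between them:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "soundout-spell-code",
"metadata": {},
"outputs": [],
"source": [
"# sketch: sound a word out into phonemes, then respell it from those phonemes\n",
"phonemes = pin.soundout(\"future\")\n",
"print(phonemes)\n",
"print(pin.spell(phonemes))"
]
},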
{
"cell_type": "code",
"execution_count": 8,
"id": "b648b5ae",
"metadata": {},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "0940444627934c01ba4c2252a25b8568",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"interactive(children=(Text(value='any one future died', description='s'), FloatSlider(value=1.2500000000000002…"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"@interact(s=\"any one future died\",temp=(0.05,2.5,0.05))\n",
"def tempadjust(s,temp):\n",
" return ' '.join([pin.manipulate(w.lower(),temperature=temp) for w in s.split()])"
]
},
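{
"cell_type": "markdown",
"id": "temperature-sketch-note",
"metadata": {},
"source": [
"What does `temperature` do? A rough illustrative sketch in plain Python (not Pincelate's actual internals): a sampling temperature rescales a probability distribution, so values below 1 sharpen it towards the most likely letters, while values above 1 flatten it and let unlikely letters through - which is what \"melts\" the spelling."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "temperature-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"import math\n",
"\n",
"def apply_temperature(probs, temp):\n",
"    # rescale a probability distribution by a sampling temperature:\n",
"    # temp < 1 sharpens it, temp > 1 flattens it towards uniform\n",
"    exps = [math.exp(math.log(p) / temp) for p in probs]\n",
"    total = sum(exps)\n",
"    return [e / total for e in exps]\n",
"\n",
"probs = [0.7, 0.2, 0.1]\n",
"print(apply_temperature(probs, 0.5))  # sharper: mass moves to the top choice\n",
"print(apply_temperature(probs, 2.0))  # flatter: unlikely choices gain mass"
]
},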
{
"cell_type": "markdown",
"id": "573dbac9",
"metadata": {},
"source": [
"## Difference of dictionaries\n",
"\n",
 It is often forgotten">
"> “It is often forgotten that dictionaries are artificial repositories, put together well after the languages they define. The roots of language are irrational and of a magical nature.” Jorge Luis Borges, El otro, el mismo (Buenos Aires: Emecé, 2005), 5.\n",
"\n",
"Starting from the many worlds and layers that words carry with them, with this exercise, we propose to start the conference by sharing meanings, associations and (mis)understandings we have around them with each other.\n",
"\n",
"The code below is borrowed from a [temporary configuration](http://varia.zone/en/wordmord-dear-language.html) of [WordMord](https://wordmord-ur.la/), which came together around a *para-dictionary* that used to bend a text in different directions. \n",
"\n",
"The code is not using machine learning models, in face it is a very basic script that proposes to start with three kinds of meaning making processes:\n",
"\n",
"* paranyms - words having nearly the same meaning a feeling about a word that is beyond, below, behind formalised definitions. Closely resembling the term but not the same.\n",
"\n",
"* parameanings - could be our understanding of terms that run parallel to the formal definition It could be a 'paranormal' understanding - one outside our known comprehensions.\n",
"\n",
"* paradoxes - in this conference, paradoxes are \"as a condition of existence that has the potential to shake the common sense of AI\". What are paradoxes that surround specific terms?\n",
"\n",
"Before running the code, please edit the collective dictionary here [https://pad.vvvvvvaria.org/any_one_day_this_dictionary_has_died](https://pad.vvvvvvaria.org/any_one_day_this_dictionary_has_died). Then, choose a sentence/expression/title to respell and run the following cell."
]
},
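{
"cell_type": "markdown",
"id": "dictionary-structure-note",
"metadata": {},
"source": [
"The cell below downloads the pad and parses it as a dictionary. A minimal, hypothetical example of the structure the code expects (the entries here are placeholders, and the key names 'paramyms', 'parameanings' and 'paradoxes' must match the ones used in the pad):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "dictionary-structure-code",
"metadata": {},
"outputs": [],
"source": [
"import random\n",
"\n",
"# hypothetical stand-in for the pad's collective dictionary\n",
"example = {\n",
"    'future': {\n",
"        'paramyms': ['forward-haunting', 'not-yet'],\n",
"        'parameanings': ['a debt the present owes'],\n",
"        'paradoxes': ['the future has already died'],\n",
"    }\n",
"}\n",
"\n",
"# the same replacement move the cell below performs\n",
"print('any one future'.replace('future', random.choice(example['future']['paramyms'])))"
]
},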
{
"cell_type": "code",
"execution_count": 2,
"id": "189c5b06",
"metadata": {},
"outputs": [
{
"ename": "KeyboardInterrupt",
"evalue": "Interrupted by user",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)",
"\u001b[0;32m<ipython-input-2-221c0e04c634>\u001b[0m in \u001b[0;36m<module>\u001b[0;34m\u001b[0m\n\u001b[1;32m 12\u001b[0m \u001b[0;31m#print(wordmord['death']['para-etymology'][0])\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 13\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 14\u001b[0;31m \u001b[0msentence\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'give me your words / δώσε μου κείμενο: '\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 15\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 16\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mmakedemonic\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m~/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/ipykernel/kernelbase.py\u001b[0m in \u001b[0;36mraw_input\u001b[0;34m(self, prompt)\u001b[0m\n\u001b[1;32m 852\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_parent_ident\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 853\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_parent_header\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 854\u001b[0;31m \u001b[0mpassword\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 855\u001b[0m )\n\u001b[1;32m 856\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;32m~/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/ipykernel/kernelbase.py\u001b[0m in \u001b[0;36m_input_request\u001b[0;34m(self, prompt, ident, parent, password)\u001b[0m\n\u001b[1;32m 893\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mKeyboardInterrupt\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 894\u001b[0m \u001b[0;31m# re-raise KeyboardInterrupt, to truncate traceback\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 895\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mKeyboardInterrupt\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Interrupted by user\"\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 896\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mException\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 897\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlog\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mwarning\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Invalid Message:\"\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mexc_info\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mTrue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
"\u001b[0;31mKeyboardInterrupt\u001b[0m: Interrupted by user"
]
}
],
"source": [
"import random\n",
"import time\n",
"import json\n",
"import wget\n",
"\n",
"url = \"https://pad.vvvvvvaria.org/any_one_day_this_dictionary_has_died/export/txt\"\n",
"wget.download(url, 'any_one_day_this_dictionary_has_died.json')\n",
" \n",
"with open('any_one_day_this_dictionary_has_died.json', 'r') as f:\n",
" wordmord = json.loads(f.read().replace(\"'\", '\"'))\n",
"\n",
"#print(wordmord['death']['para-etymology'][0])\n",
"\n",
"sentence = input('give me your words / δώσε μου κείμενο: ')\n",
"\n",
"def makedemonic():\n",
"\tnew_sentence = sentence\n",
"\tfor word in wordmord:\n",
"\t\tif word in new_sentence:\n",
"\t\t\tnew_sentence = new_sentence.replace(word, random.choice(wordmord[word]['paramyms']))\n",
"\tprint(new_sentence)\n",
"\n",
"def makepara():\n",
"\tnew_sentence = sentence\n",
"\tfor word in wordmord:\n",
"\t\tif word in new_sentence:\n",
"\t\t\tnew_sentence = new_sentence.replace(word, random.choice(wordmord[word]['parameanings']))\n",
"\tprint(new_sentence)\n",
"\n",
"def makepira():\n",
"\tnew_sentence = sentence\n",
"\tfor word in wordmord:\n",
"\t\tif word in new_sentence:\n",
"\t\t\tnew_sentence = new_sentence.replace(word, random.choice(wordmord[word]['paradoxes']))\n",
"\tprint(new_sentence)\n",
"\n",
"\n",
"type = input('choose type of transformation / τύπος μετάλλαξης: ')\n",
"\n",
"if type == 'demonic':\n",
"\tmakedemonic()\n",
"elif type == 'para':\n",
"\tmakepara()\n",
"elif type == 'pira':\n",
"\tmakepira()\n",
"else:\n",
"\tmakedemonic()\n",
"\ttime.sleep(1)\n",
"\tmakepara()\n",
"\ttime.sleep(1)\n",
"\tmakepira()\n",
"\n"
]
},
{
"cell_type": "markdown",
"id": "28782db3",
"metadata": {},
"source": [
"## Chant with, for, against, around, about AI\n",
"\n",
"By chanting, we refer to a repeated rhythmic phrase, one sung loudly and in unison with a group.\n",
"Chants are used in spiritual practices, protests, football matches, demonstrations, battles. They are words travelling surfacing through our bodies out into the world.\n",
"In this final part, we will make our own chant with, for, against, around, about AI that we can keep repeating throughout the conference.\n",
"\n",
"In groups of 4, and using the Para-generator, take a few minutes to make a chant to share with everyone else. A tune out."
]
},
{
"cell_type": "markdown",
"id": "f597b743",
"metadata": {},
"source": [
"You can use this pad [https://pad.vvvvvvaria.org/any_one_day_this_chant_will_live](https://pad.vvvvvvaria.org/any_one_day_this_chant_will_live) to draft your chants."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.2"
}
},
"nbformat": 4,
"nbformat_minor": 5
}