Repo for the A Tune In workshop by Amy, Aggeliki and Cristina for the Any One Day The Future Died conference.

{
"cells": [
{
"cell_type": "markdown",
"id": "d4e131d0",
"metadata": {},
"source": [
"# * .* A tune in **. . * \n",
"\n",
"A workshop by [Varia](http://varia.zone/en/).\n",
"\n",
"We are now on a Jupyter Notebook. The code you are running is downloaded on one of our computers, and you are accessing it from a local server. To run the code, select the code cells in order and press \"Run\" in the toolbar above.\n",
"\n",
"\n",
"## Pincelate\n",
"\n",
"[Pincelate](https://pypi.org/project/pincelate/) is a machine learning model for spelling English words and sounding them out developed by artist [Allison Parrish](https://www.decontextualize.com/). It relies on particular versions of the Google-funded pre-trained machine learning models in [Tensorflow](https://www.tensorflow.org/) and the deep-learning API [Keras](https://keras.io/).\n",
"\n",
"\n",
"### Installing Pincelate"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ranking-bottom",
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Requirement already satisfied: tensorflow==1.15.5 in ./pocket/lib/python3.6/site-packages (1.15.5)\n",
"Requirement already satisfied: six>=1.10.0 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.16.0)\n",
"Requirement already satisfied: wrapt>=1.11.1 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.14.1)\n",
"Requirement already satisfied: astor>=0.6.0 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (0.8.1)\n",
"Requirement already satisfied: h5py<=2.10.0 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (2.10.0)\n",
"Requirement already satisfied: grpcio>=1.8.6 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.48.2)\n",
"Requirement already satisfied: wheel>=0.26 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (0.37.1)\n",
"Requirement already satisfied: keras-preprocessing>=1.0.5 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.1.2)\n",
"Requirement already satisfied: protobuf>=3.6.1 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (3.19.6)\n",
"Requirement already satisfied: gast==0.2.2 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (0.2.2)\n",
"Requirement already satisfied: keras-applications>=1.0.8 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.0.8)\n",
"Requirement already satisfied: numpy<1.19.0,>=1.16.0 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.18.5)\n",
"Requirement already satisfied: tensorflow-estimator==1.15.1 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.15.1)\n",
"Requirement already satisfied: absl-py>=0.7.0 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.3.0)\n",
"Requirement already satisfied: google-pasta>=0.1.6 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (0.2.0)\n",
"Requirement already satisfied: opt-einsum>=2.3.2 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (3.3.0)\n",
"Requirement already satisfied: tensorboard<1.16.0,>=1.15.0 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.15.0)\n",
"Requirement already satisfied: termcolor>=1.1.0 in ./pocket/lib/python3.6/site-packages (from tensorflow==1.15.5) (1.1.0)\n",
"Requirement already satisfied: markdown>=2.6.8 in ./pocket/lib/python3.6/site-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.5) (3.3.7)\n",
"Requirement already satisfied: werkzeug>=0.11.15 in ./pocket/lib/python3.6/site-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.5) (2.0.3)\n",
"Requirement already satisfied: setuptools>=41.0.0 in ./pocket/lib/python3.6/site-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.5) (59.6.0)\n",
"Requirement already satisfied: importlib-metadata>=4.4 in ./pocket/lib/python3.6/site-packages (from markdown>=2.6.8->tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.5) (4.8.3)\n",
"Requirement already satisfied: dataclasses in ./pocket/lib/python3.6/site-packages (from werkzeug>=0.11.15->tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.5) (0.8)\n",
"Requirement already satisfied: typing-extensions>=3.6.4 in ./pocket/lib/python3.6/site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.5) (4.1.1)\n",
"Requirement already satisfied: zipp>=0.5 in ./pocket/lib/python3.6/site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.5) (3.6.0)\n",
"Requirement already satisfied: keras==2.2.5 in ./pocket/lib/python3.6/site-packages (2.2.5)\n",
"Requirement already satisfied: scipy>=0.14 in ./pocket/lib/python3.6/site-packages (from keras==2.2.5) (1.5.4)\n",
"Requirement already satisfied: numpy>=1.9.1 in ./pocket/lib/python3.6/site-packages (from keras==2.2.5) (1.18.5)\n",
"Requirement already satisfied: six>=1.9.0 in ./pocket/lib/python3.6/site-packages (from keras==2.2.5) (1.16.0)\n",
"Requirement already satisfied: keras-applications>=1.0.8 in ./pocket/lib/python3.6/site-packages (from keras==2.2.5) (1.0.8)\n",
"Requirement already satisfied: keras-preprocessing>=1.1.0 in ./pocket/lib/python3.6/site-packages (from keras==2.2.5) (1.1.2)\n",
"Requirement already satisfied: h5py in ./pocket/lib/python3.6/site-packages (from keras==2.2.5) (2.10.0)\n",
"Requirement already satisfied: pyyaml in ./pocket/lib/python3.6/site-packages (from keras==2.2.5) (6.0)\n",
"Requirement already satisfied: pincelate in ./pocket/lib/python3.6/site-packages (0.0.1)\n",
"Requirement already satisfied: pronouncing>=0.2.0 in ./pocket/lib/python3.6/site-packages (from pincelate) (0.2.0)\n",
"Requirement already satisfied: Keras>=2.2.0 in ./pocket/lib/python3.6/site-packages (from pincelate) (2.2.5)\n",
"Requirement already satisfied: scikit-learn>=0.20.0 in ./pocket/lib/python3.6/site-packages (from pincelate) (0.24.2)\n",
"Requirement already satisfied: keras-applications>=1.0.8 in ./pocket/lib/python3.6/site-packages (from Keras>=2.2.0->pincelate) (1.0.8)\n",
"Requirement already satisfied: numpy>=1.9.1 in ./pocket/lib/python3.6/site-packages (from Keras>=2.2.0->pincelate) (1.18.5)\n",
"Requirement already satisfied: six>=1.9.0 in ./pocket/lib/python3.6/site-packages (from Keras>=2.2.0->pincelate) (1.16.0)\n",
"Requirement already satisfied: pyyaml in ./pocket/lib/python3.6/site-packages (from Keras>=2.2.0->pincelate) (6.0)\n",
"Requirement already satisfied: scipy>=0.14 in ./pocket/lib/python3.6/site-packages (from Keras>=2.2.0->pincelate) (1.5.4)\n",
"Requirement already satisfied: keras-preprocessing>=1.1.0 in ./pocket/lib/python3.6/site-packages (from Keras>=2.2.0->pincelate) (1.1.2)\n",
"Requirement already satisfied: h5py in ./pocket/lib/python3.6/site-packages (from Keras>=2.2.0->pincelate) (2.10.0)\n",
"Requirement already satisfied: cmudict>=0.4.0 in ./pocket/lib/python3.6/site-packages (from pronouncing>=0.2.0->pincelate) (1.0.2)\n",
"Requirement already satisfied: joblib>=0.11 in ./pocket/lib/python3.6/site-packages (from scikit-learn>=0.20.0->pincelate) (1.1.1)\n",
"Requirement already satisfied: threadpoolctl>=2.0.0 in ./pocket/lib/python3.6/site-packages (from scikit-learn>=0.20.0->pincelate) (3.1.0)\n"
]
}
],
"source": [
"import ipywidgets as widgets\n",
"from IPython.display import display,HTML\n",
"from ipywidgets import interact,interactive_output,Layout,HBox,VBox\n",
"import sys\n",
"!{sys.executable} -m pip install tensorflow==1.15.5\n",
"!{sys.executable} -m pip install keras==2.2.5\n",
"!{sys.executable} -m pip install pincelate\n",
"from pincelate import Pincelate\n",
"pin = Pincelate()"
]
},
{
"cell_type": "markdown",
"id": "9f1e2cc7",
"metadata": {},
"source": [
"This is how Parrish describes the *manipulate* function of Pincelate: \"Pincelate actually consists of two models: one that knows how to sound out words based on how they're spelled, and another that knows how to spell words from sounds.\"\n",
"\n",
"With this function, words are passing through the two models, taking words, sounding them out from their spelling and then bringing them back into writing via phonemes. As a way to situate ourselves into the conference, we propose to take the conference titles as material to reshape, contort, or respell. \n",
"\n",
"By respelling, we are not only writing out words again, but also remaking the spells that words cast. \n",
"You can make the temperature higher to \"melt\" the words.\n",
"\n",
"\n",
"**CONFERENCE TITLES**\n",
"\n",
"Please keep Crashing (Workshop Documentation)\n",
"\n",
"The Future has died (Conference Opening)\n",
"\n",
"Race Against the Time Machine (tbc, Workshop)\n",
"\n",
"Decolonizing Digital Archives (Conversation) \n",
"\n",
"AI – Between Apocalypse, Lethargy, and Wonder (Panel)\n",
"\n",
"DeColoniality of AI (Panel)\n",
"\n",
"Decolonial Weavings, Vernacular Algorithms\n",
"\n",
"Ecologies of Dreaming Beyond AI\n",
"\n",
"Technopoetics of Sound. Listening Session for Any | One Day\n",
"\n",
"Entangled Media Philosophies of Technology\n",
"\n",
"Predicting Backwards: Generating Histories\n",
"\n",
"Sedimented Temporalities of Geodigital Landscapes"
]
},
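{
"cell_type": "markdown",
"id": "soundout-spell-note",
"metadata": {},
"source": [
"*Optional sketch (not part of the original workshop code):* the cell below calls the two models separately, using the `soundout()` and `spell()` methods described in the Pincelate documentation. One turns a spelling into phonemes, the other turns phonemes back into a spelling; the word and the temperature value are arbitrary placeholders, so adjust them if your installed version behaves differently."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "soundout-spell-sketch",
"metadata": {},
"outputs": [],
"source": [
"# Sketch of Pincelate's two models used separately (soundout/spell are assumed\n",
"# from the library's documentation; the word and temperature are placeholders).\n",
"phonemes = pin.soundout(\"future\")   # spelling -> phonemes\n",
"print(phonemes)\n",
"print(pin.spell(phonemes))          # phonemes -> spelling\n",
"print(pin.manipulate(\"future\", temperature=0.5))  # a low temperature stays close to the original spelling"
]
},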
{
"cell_type": "code",
"execution_count": 3,
"id": "muslim-labor",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:541: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:66: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:4432: The name tf.random_uniform is deprecated. Please use tf.random.uniform instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:148: The name tf.placeholder_with_default is deprecated. Please use tf.compat.v1.placeholder_with_default instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:3733: calling dropout (from tensorflow.python.ops.nn_ops) with keep_prob is deprecated and will be removed in a future version.\n",
"Instructions for updating:\n",
"Please use `rate` instead of `keep_prob`. Rate should be set to `rate = 1 - keep_prob`.\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:3239: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\n",
"Instructions for updating:\n",
"Use tf.where in 2.0, which has the same broadcast rule as np.where\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:190: The name tf.get_default_session is deprecated. Please use tf.compat.v1.get_default_session instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:197: The name tf.ConfigProto is deprecated. Please use tf.compat.v1.ConfigProto instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:203: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:207: The name tf.global_variables is deprecated. Please use tf.compat.v1.global_variables instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:216: The name tf.is_variable_initialized is deprecated. Please use tf.compat.v1.is_variable_initialized instead.\n",
"\n",
"WARNING:tensorflow:From /home/angeliki/Documents/Any_One_day_future/A_tune_in/pocket/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:223: The name tf.variables_initializer is deprecated. Please use tf.compat.v1.variables_initializer instead.\n",
"\n"
]
},
{
"data": {
"text/plain": [
"'icr.ckung'"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"pin.manipulate(\"crashing\",temperature=3)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "b648b5ae",
"metadata": {},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "61490633621a4f568b855628db6d69d8",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"interactive(children=(Text(value='any one future died', description='s'), FloatSlider(value=1.2500000000000002…"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"@interact(s=\"any one future died\",temp=(0.05,2.5,0.05))\n",
"def tempadjust(s,temp):\n",
" return ' '.join([pin.manipulate(w.lower(),temperature=temp) for w in s.split()])"
]
},
{
"cell_type": "markdown",
"id": "573dbac9",
"metadata": {},
"source": [
"## Difference of dictionaries\n",
"\n",
"> It is often forgotten that dictionaries are artificial repositories, put together well after the languages they define. The roots of language are irrational and of a magical nature.” Jorge Luis Borges, El otro, el mismo. (Buenos Aires: Emecé, 2005), 5.\n",
"\n",
"Starting from the many worlds and layers that words carry with them, with this exercise, we propose to start the conference by sharing meanings, associations and (mis)understandings we have around them with each other.\n",
"\n",
"The code below is borrowed from a [temporary configuration](http://varia.zone/en/wordmord-dear-language.html) of [WordMord](https://wordmord-ur.la/), which came together around a *para-dictionary* that used to bend a text in different directions. \n",
"\n",
"The code is not using machine learning models, in face it is a very basic script that proposes to start with three kinds of meaning making processes:\n",
"\n",
"* paranyms - words having nearly the same meaning a feeling about a word that is beyond, below, behind formalised definitions. Closely resembling the term but not the same.\n",
"\n",
"* parameanings - could be our understanding of terms that run parallel to the formal definition It could be a 'paranormal' understanding - one outside our known comprehensions.\n",
"\n",
"* paradoxes - in this conference, paradoxes are \"as a condition of existence that has the potential to shake the common sense of AI\". What are paradoxes that surround specific terms?\n",
"\n",
"Before running the code, please edit the collective dictionary here [https://pad.vvvvvvaria.org/any_one_day_this_dictionary_has_died](https://pad.vvvvvvaria.org/any_one_day_this_dictionary_has_died). Then, choose a sentence/expression/title to respell and run the following cell."
]
},
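{
"cell_type": "markdown",
"id": "dictionary-structure-note",
"metadata": {},
"source": [
"The script further below expects the pad to contain a dictionary-like structure in which every word maps to three lists: 'paranyms', 'parameanings' and 'paradoxes'. The entry in the next cell is a hypothetical placeholder, only meant to show that shape; the real words and meanings come from the collective pad."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "dictionary-structure-sketch",
"metadata": {},
"outputs": [],
"source": [
"# Hypothetical example of the structure the para-generator expects on the pad.\n",
"# Each word maps to three lists of possible replacements.\n",
"example_dictionary = {\n",
"    'future': {\n",
"        'paranyms': ['later', 'not-yet'],\n",
"        'parameanings': ['a rumour about tomorrow'],\n",
"        'paradoxes': ['already archived']\n",
"    }\n",
"}\n",
"print(example_dictionary['future']['paranyms'])"
]
},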
{
"cell_type": "code",
"execution_count": null,
"id": "189c5b06",
"metadata": {},
"outputs": [],
"source": [
"import random\n",
"import time\n",
"import json\n",
"import wget\n",
"import os\n",
"\n",
"url = \"https://pad.vvvvvvaria.org/any_one_day_this_dictionary_has_died/export/txt\"\n",
"\n",
"path= \"/home/angeliki/Documents/Any_One_day_future/A_tune_in\"\n",
"\n",
"\n",
"filename = path + '/' + os.path.basename(url) # get the full path of the file\n",
"if os.path.exists(filename):\n",
" os.remove(filename) # if exist, remove it directly\n",
"wget.download(url, out=filename) # download it to the specific path.\n",
"\n",
" \n",
"with open(filename, 'r') as f:\n",
" dictionary = json.loads(f.read().replace(\"'\", '\"'))\n",
"\n",
"\n",
"\n",
"sentence = input('give me your words: ')\n",
"\n",
"\n",
"def makedemonic():\n",
"\tnew_sentence = sentence\n",
"\tfor word in dictionary:\n",
"\t\tif word in new_sentence:\n",
"\t\t\tnew_sentence = new_sentence.replace(word, random.choice(dictionary[word]['paranyms']))\n",
"\tprint(new_sentence)\n",
"\n",
"def makepara():\n",
"\tnew_sentence = sentence\n",
"\tfor word in dictionary:\n",
"\t\tif word in new_sentence:\n",
"\t\t\tnew_sentence = new_sentence.replace(word, random.choice(dictionary[word]['parameanings']))\n",
"\tprint(new_sentence)\n",
"\n",
"def makepira():\n",
"\tnew_sentence = sentence\n",
"\tfor word in dictionary:\n",
"\t\tif word in new_sentence:\n",
"\t\t\tnew_sentence = new_sentence.replace(word, random.choice(dictionary[word]['paradoxes']))\n",
"\tprint(new_sentence)\n",
"\n",
"\n",
"type = input('choose replacing method between \"paranyms\",\"parameanings\",\"paradoxes\",\"all\": ')\n",
"\n",
"if type == 'paranyms':\n",
"\tmakedemonic()\n",
"elif type == 'parameanings':\n",
"\tmakepara()\n",
"elif type == 'paradoxes':\n",
"\tmakepira()\n",
"elif type =='all':\n",
"\tmakedemonic()\n",
"\ttime.sleep(1)\n",
"\tmakepara()\n",
"\ttime.sleep(1)\n",
"\tmakepira()\n",
"else:\n",
" print(\"Wrong input. Please run the code again\")\n",
"\n"
]
},
{
"cell_type": "markdown",
"id": "28782db3",
"metadata": {},
"source": [
"## Chant with, for, against, around, about AI\n",
"\n",
"By chanting, we refer to a repeated rhythmic phrase, one sung loudly and in unison with a group.\n",
"Chants are used in spiritual practices, protests, football matches, demonstrations, battles. They are words travelling surfacing through our bodies out into the world.\n",
"In this final part, we will make our own chant with, for, against, around, about AI that we can keep repeating throughout the conference.\n",
"\n",
"In groups of 4, and using the Para-generator, take a few minutes to make a chant to share with everyone else. A tune out."
]
},
{
"cell_type": "markdown",
"id": "f597b743",
"metadata": {},
"source": [
"You can use this pad [https://pad.vvvvvvaria.org/any_one_day_this_chant_will_live](https://pad.vvvvvvaria.org/any_one_day_this_chant_will_live) to draft your chants."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}