Web Contexts

Web Contexts Overview

In this project, you should create provocative web art which explores an issue or theme you care about. The artefact should be built using web technologies: HTML, CSS, JavaScript, Python, or other relevant tools. The artefact should take advantage of the context of the web by using techniques like interactive interfaces, personalisation, or connecting to other online content. It is expected that your piece may be an adaptation of an existing example rather than created from scratch.

Work Mode



HTML, CSS, JavaScript, Python, and Twine

Reference material


A digital artefact accompanied by a learning journal and any supplementary development work.

Learning Outcomes

    • Design and build a simple interactive digital artefact using routine computational techniques and practices
    • Apply routine scripting techniques with HTML, CSS, and JavaScript to create an interactive digital piece
    • Exercise autonomy and initiative in solving design problems and manage a project to a specified deadline.

Assessment Criteria

You will be assessed on your ability to:

  • Produce a valid, functioning HTML, CSS, JavaScript, or Python program
  • Solve the prescribed problem
  • Implement one of your own ideas
  • Document your learning process in your learning journal
  • Take the idea further at a conceptual level.

Submission Details

    • A statement of intention – what is the issue or theme? What is the message you want to put across? What is your approach to doing this?
    • Research and planning – Sketches, mockups, or plans for the design and content of the digital artefact. This should include the research you conducted on your chosen issue or theme.
    • Implementation – Working web prototype using HTML, CSS, JavaScript, or other relevant technologies.
    • Testing – Gather peer feedback on your artefact. The feedback should cover both technical implementation and artistic decisions.
    • Documentation of your process in your learning journal
    • Evaluation statement in your learning journal about where your idea could go next. Include what features you would implement in version 2 – you could support this with mock-up images. Research what existing web technology would be required to implement these features.

Monday 20th November 2017


Unfortunately I missed this as I was at my college graduation. Craig Steele is the lecturer leading this project and I’ve been told to look at his website by my classmates:


Look at ‘Session 1 – Exploring Interactive Experiences‘ on Craig’s website.

Tuesday 21st November 2017

Building For The Web


Building a website in HTML5
Style your own website with CSS3
Add some interactivity with JavaScript

HTML = Hyper Text Mark-up Language
HTTP = Hyper Text Transfer Protocol

Tim Berners-Lee (inventor of the Web and founder of the W3C) regrets the '//' in URLs because it was difficult to locate/remember and was only chosen because it looks cool

30% of app profits go to Apple

Rooms/buildings of servers built in cold countries to save costs of cooling the rooms

Opera | Mozilla | Apple | Google   | Microsoft
Opera | Firefox | Safari | Chrome | Internet Explorer

Web developer tools add-on for Chrome

Press ‘P’ + ‘TAB’ for automatic <p> </p>
Follow the same format for other tags

HTML Tags:
<h1> = header 1
<h2> = header 2
<p> = paragraph
<img src="" alt=""> = image
<a href=""> = anchor (a hyperlink)
href="" = the link's destination URL
<div> = splits stuff into groups
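
Put together, the tags above form a minimal page. A sketch (the file name, text, and URL are made up for illustration):

```html
<!DOCTYPE html>
<html>
<head>
  <title>My First Page</title>
</head>
<body>
  <h1>Main header</h1>
  <div>
    <h2>A sub-header</h2>
    <p>A paragraph of text.</p>
    <!-- src points at the image file, alt describes it in words -->
    <img src="photo.jpg" alt="A photo">
    <!-- href holds the destination URL -->
    <a href="https://example.com">A link</a>
  </div>
</body>
</html>
```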

CSS:
State the selector (where you want to apply the design) first, then the declarations (what design)
Uses American spelling
Uses colour names, hexadecimal codes, or RGB colour values
Use percentages as opposed to direct values

background-color: grey;
font-size: 52px;
color: hotpink;
font-family: ‘Arial’, sans-serif;
text-align: center;
width: 60%;
margin-left: auto; = automatically centers content …
margin-right: auto; = … no matter what screen size
margin-top: 5%;
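
The declarations above only take effect inside a rule: the selector (where) comes first, then the declarations (what) in braces. A sketch combining the properties from my notes (the `p` selector is just an example):

```css
/* Selector first (every <p>), then the declarations in braces */
p {
  background-color: grey;
  color: hotpink;
  font-size: 52px;
  font-family: 'Arial', sans-serif;
  text-align: center;
  width: 60%;          /* percentages adapt to the screen size */
  margin-left: auto;   /* auto left + right margins center the block */
  margin-right: auto;
  margin-top: 5%;
}
```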

Google Fonts
Makes sure fonts render consistently across numerous devices

In settings -> CSS -> Add External CSS -> Quick-add -> Bootstrap 3
Responsive images

JavaScript:
jQuery = $
In settings -> JavaScript -> Add External JavaScript -> Quick-add -> jQuery

console.log() = similar to println
// = comment

function main() {
  var location = ""; // URL of the JSON feed
  $.getJSON(location, function (data) {
    var result = data.rss.channel … ; // path into the JSON (truncated in my notes)
    var parts = result.split(' ');    // splits words on spaces ' '
    var current_level = parts[6];     // arrays count from 0
    $('#level').text(current_level);  // swaps #level's text with what's held in current_level
    if ( … == "") {                   // if something is equal to something …
    }
  });
}


Look at ‘Session 2 – Zero to Digital: Recap‘ on Craig’s website.

Wednesday 22nd November 2017

Twitter Bots

Flow Machines – AI-composed music in the style of the Beatles

Following patterns, templates, rules

@pentametron – Retweets rhyming tweets in iambic pentameter
@botcraig88 – Craig’s bot

Markov Chain
A way of predicting what’s next based on previous experience

The class pulled out their smartphones and were told to type out the beginning of some sentences and then repeatedly press the middle predictive-text button that suggests the next word. The two sentence starters we were given were “I hate it …” and “I need …”, and then Eve suggested I try “Give me …”. There was no instruction as to when to stop so we were left to our own devices. Here is the result:

“I hate it when I get a follow back on my way home”
“I need a good time”
“Give me the most beautiful girl”
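
The keyboard's predictions work like a Markov chain: record which word followed which, then pick a random observed follower. A minimal word-level sketch in JavaScript (the training sentence is made up):

```javascript
// Build a table mapping each word to the words seen directly after it
function buildChain(text) {
  const words = text.split(' ');
  const chain = {};
  for (let i = 0; i < words.length - 1; i++) {
    (chain[words[i]] = chain[words[i]] || []).push(words[i + 1]);
  }
  return chain;
}

// Predict the next word by sampling the observed followers
function predictNext(chain, word) {
  const options = chain[word] || [];
  return options[Math.floor(Math.random() * options.length)];
}

const chain = buildChain('I hate it when I need it');
// chain['I'] is ['hate', 'need'], so the prediction is one of those two
```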

Create your own mashup bot
Using two contrasting Twitter accounts
Bring an author back from the dead
Use another text source as input?

My Twitter Bot

@elliot4shaw is me

@Bottboy is Ankita


Look at ‘Session 3 – Twitter Bot‘ on Craig’s website.

Thursday 23rd November 2017

Simulating the World (in Emoji)

“Our world is run by complex systems. We may feel helpless to change them or understand them but I believe all of us can — and must — learn to think in systems. That’s where simulations come in! It’s easier to understand a system if you can see the system, or better yet, play with the system.” – Nicky Case


My example: http://ncase.me/simulating/model/?remote=-KzpPR0fbQzJt1NnYbdw

Friday 24th November 2017

Max MSP Workshop

Max = numbers
MSP = music

Start a new ‘patch’

Lock/Unlock patch = Ctrl + Click or Ctrl + e

Musical note on the left bar to input sound
Can click and drag own sound from folder
If dragged over a sound clip it will replace it

ezdac~ = Easy Digital-to-Analogue Converter
Click it to turn it on – it turns blue and the power logo in the bottom right turns blue

Click object and press ‘inspect’ on the right side

Hot and cold inlets

Grey cord = sending numbers
Stripy yellow and black cord = audio
Stripy green and black cord = video
Cords do not mix

Anything with a ~ works with audio

speed $1
$1 = like a variable means you can input any number (in this case from a slider)

n = new object
m = message/text
t = toggle
b = bang
c = comment in presentation mode

‘pitchshift $1’ won’t work without ‘timestretch $1’

Film reel on the left bar to input video
Jitter = moving image
Right click jit help
jit.window for a black box that displays the video when you link it to a file via a cord

Max MSP Jitter

metro = metronome (the interval is in milliseconds)

cycle~ = tone

Max can talk to a Teensy over serial
Can send OSC over wifi to a Teensy
Can also use UDP

Max MSP Keyboard

‘kslider’ = keyboard

MIDI is a way of communicating

Can send between patches by using, for example, ‘r miditrigger’ and ‘s miditrigger’

MINIX Neo Z83-4
A mini computer that you can use in installations
About £150

Monday 27th November 2017


“This project has to make the most of what is online.” – Craig Steele

Out of the things we looked at last week I’ve found myself more drawn to ‘Simulating the World (in emoji)’ and have been experimenting with the software.

“Think in systems”

– Nicky Case

I’ve decided to simulate life.

Simulations can help explain a system, but they’re also great for exploring systems. I’m hoping this concept will allow me to explore the cycle of life and what affects it.

I looked at many emoji websites and decided upon http://getemoji.com to get my emojis since I’m on a Windows computer. It’s easy just to copy and paste the emojis I want into the software. I didn’t realise how much the choice of emojis would influence my project.


As you can see, the emojis have been updated to include a whole spectrum of families. This changed the dynamics of my original thoughts because it meant that I wasn’t confined to just using a heterosexual family. I decided not to label my humans as either male or female, and that I would make it possible for a human to get with ANY other human. By definition this would make them bisexual, not pansexual if we’re getting that technical, as the emojis are simply working off appearances. This project has become very LGBT+ friendly, but hey, that’s how life should be.

Unfortunately I didn’t realise how much more work that would leave me. It meant that I had to individually code each possible pairing of humans, and then what type of family they could progress on to. Each scenario (Human 1 + Human 2, 1+1, 2+2) then led to a couple, then a progression stage I’ve called ‘Kissing’ (later changed to ‘In Love’), and then they go on to make a family.
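
The pairing combinatorics I had to write out by hand can be sketched in ordinary JavaScript (the names are illustrative; this is not the simulator's actual rule syntax):

```javascript
// Each unordered pairing of the two human types leads to a couple type,
// matching the Dating 1/2/3 emojis listed later in this journal.
const coupleOf = {
  'Human1+Human1': 'Dating3', // 👨‍❤️‍👨
  'Human1+Human2': 'Dating1', // 💑
  'Human2+Human2': 'Dating2', // 👩‍❤️‍👩
};

function pairUp(a, b) {
  // Sort so Human2+Human1 and Human1+Human2 give the same couple
  return coupleOf[[a, b].sort().join('+')];
}

console.log(pairUp('Human2', 'Human1')); // 'Dating1'
```

With only two human types there are three unordered pairings; every extra type would add a whole new row of scenarios, which is why each new emoji multiplied the work.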

Edit 1

Here is my progress:


In a future simulation I could develop it further and introduce items such as food, and say that the humans can only eat if they are in contact with the food, but then this becomes a simulation of survival.

I also realised that I am confined by the single squares that make up the grid. This means that I can only have one emoji per square. I could have two emojis, but this would cross over into the neighbouring square and could overlap another emoji. This became a problem when trying to simulate marriage because there wasn’t a single emoji that I was happy with that represented two humans getting together. I could only find 👰 🤵 but that would cover two squares. I’ll either have to wait for an emoji update with a suitable emoji or a software update that allows for bigger squares. I did however use ☠️☠️ for a dead couple, but I made it disappear almost immediately so it doesn’t interfere with other emojis for too long.

Tuesday 28th November 2017

Emoji Realisation

One thing I didn’t realise was how viewing my simulation on different platforms could influence the outcome. Different platforms, such as Windows or Apple, render emojis differently.

Family Emoji

I first experienced this with my favourite full-moon-face emoji. I would use it constantly on my Apple iPhone, which would portray the desired cheeky emotion (the second one in the image down below), but when carried onto other platforms such as my Facebook statuses or Facebook Messenger I’d get a completely different emoji which would portray a different emotion (like that funny blue one below). This would make me resort to an alternative emoji.

Moon Emoji

A prominent example is the pistol. The majority of platforms view it as a real gun, whereas some view it as a water pistol. Depending on the context, this could really change the outcome of the conversation.

Gun Emoji


This has already affected me because there were some emojis that I couldn’t use, because when I copied them over to the simulation they wouldn’t copy correctly. Some female emojis would paste as the male version followed by the female sign.

Unicode. (2017). Full Emoji [Online] Available at: http://unicode.org/emoji/charts/full-emoji-list.html [Last accessed 30th Nov 2017]

Here is my progress on my simulation:


Wednesday 29th November 2017


Craig and I came across a problem with the software as he was assisting me with a logical approach to a scenario that I wanted help with. We realised that an emoji can’t turn into another emoji AND move in the same command, even though the command would let us write that. This held me back a bit and meant that some of my ideas wouldn’t work. To overcome the simple task of getting two humans together we had to introduce a middle step: we had to make them turn into matching love hearts when they came into contact with a human they were interested in, and then those two love hearts would merge together to create a couple. Unfortunately this means that some other ideas I had aren’t possible without interfering with other scenarios, so I’ll have to wait until Nicky Case updates the software before I’m able to carry out these tasks. Craig has suggested that we get in touch.
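
The two-step workaround can be sketched as two separate passes (illustrative names and emojis, not the simulator's rule syntax): each rule either changes an emoji or moves it, never both.

```javascript
// Pass 1: an interested human turns into a love heart in place (change, no move)
function becomeHeart(cell) {
  return cell === '👦' || cell === '👩' ? '💗' : cell;
}

// Pass 2: two adjacent hearts merge into a couple; the neighbour square empties
function mergeHearts(cell, neighbour) {
  if (cell === '💗' && neighbour === '💗') return ['💑', ''];
  return [cell, neighbour];
}

console.log(mergeHearts(becomeHeart('👦'), becomeHeart('👩')));
```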

As for the simulation I want people to be able to interact with it, and so I’ve written an introduction to explain what the users will be seeing on screen:


This is a simulation of life cycles.

When humans encounter other humans they have a chance to get together, and then they have a chance to extend their family. In this simulation everyone is bisexual.

All single humans move around, whereas humans that have got together do not, as they have settled down with each other.

Unfortunately they also have a chance of passing away no matter what stage of their life they are at. I have included the Grim Reaper 👹, who moves around killing people, and the Angel 👼, who aims to revive the dead.

The world starts with a mixture of humans that are at various stages of the life cycle.

The commands below look very intimidating, but feel free to scroll down each of the emojis and manipulate the values to see what attributes of the cycle of life they affect.

I’ve then listed some tasks for the users to complete if they wish. This will give them the opportunity to manipulate what they see on screen and to see what the different components of the code affect:


1) Go through the humans 👦👩 and the couples 💑👩‍❤️‍👩👨‍❤️‍👨💏👩‍❤️‍💋‍👩👨‍❤️‍💋‍👨 and increase the chance of the Grim Reaper 👹 catching them. What do you think will happen?

2) Under the Baby 👶 emoji, manipulate the percentage of it growing up to be either Human 1 👦 or Human 2 👩. How does this affect the cycle?

3) Is the population of humans increasing/decreasing/remaining the same? What could you do to change that?

4) Can you make the human race become extinct?

5) What other attributes could be added to further the idea of the cycle of life? (e.g: marriage, illness)
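
Task 2's grow-up percentage is just a weighted random choice. A sketch of the idea in JavaScript (the function name is mine, not the simulator's):

```javascript
// chanceHuman1 is a percentage: 50 gives a 50/50 split
function growUp(chanceHuman1) {
  return Math.random() * 100 < chanceHuman1 ? 'Human1' : 'Human2';
}

// At 100% every baby becomes Human 1; at 0% every baby becomes Human 2,
// which skews who is around to pair up and so affects the whole cycle.
```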

Then for those who are comfortable with the software, I’ve simply listed the emojis that I would recommend manipulating:


👹 Grim Reaper
☠️ Dead
👼 Angel
👦 Human 1
👩 Human 2
💑 Dating 1
👩‍❤️‍👩 Dating 2
👨‍❤️‍👨 Dating 3
💏 In Love 1
👩‍❤️‍💋‍👩 In Love 2
👨‍❤️‍💋‍👨 In Love 3
👪 Family 1
👩‍👩‍👦 Family 2
👨‍❤️‍💋‍👨 Family 3
👶 Baby
👨‍👩‍👧‍👦 Family 4
👩‍👩‍👦‍👦 Family 5
👨‍👨‍👧‍👦 Family 6

I asked my flatmate Dimitri to test my simulation, and he managed to get on with it well but asked me how to get back to the original simulation after he had edited some values. This led me to add an extra paragraph to the end of my introduction:

If you wish to undo any changes you’ve made and return to the original simulation, scroll down to the very bottom and press “undo all changes”.



Thursday 30th November 2017


I feel that my life simulation using Nicky Case’s online simulation software has turned out quite successful in showing the process from a baby 👶, to an adult 👦, to dating 💑, to falling in love 💏, to starting a family and having children 👪, and then death ☠️. Simulating this makes it easier for me and others to understand the system, in this case the circle of life, because we can physically see it. We are then able to understand it further by playing with it, and seeing what attributes affect what parts of the system. All complex things have things in common, and so that allows us to make hypotheses based on what we’ve seen.

As for future developments, I feel this would coincide with the continuous updating of emojis and the simulation software. I saw that there were families with just one parent, and in the future I would like to add this in for when partners either leave or pass away. Emoji skin tones have also been a recent update, so that would also be an interesting thing to integrate within my simulation, especially to see the skin colour of the children. I could also add in extra steps of relationships, such as marriage, moving in together, getting pets, growing old together, etc. I’d originally wanted to make a pregnancy stage, where I could factor in the chance of conception and birth, but the only emoji I could find that would represent that was a biological woman 🤰 and that didn’t represent men that were going to have children.

Getting real data, for example the chance of a baby being male or female, was an idea that I could also implement in the future.


Friday 1st December 2017

The month of Christmas 😎


Today is the day we presented our final result to Craig, the project lecturer, and Inga, the course leader. We each got 5-10 minutes to present our work.


I presented the link above and my WordPress. I can’t remember exactly what was said but here are some points:

  • Inga didn’t realise how complex emojis were
  • She commends me for following through with ideas when I realised how complex they were as opposed to blocking them
  • She mentioned that in some ways the project was out of my hands because different browsers and platforms determined how my work came across
  • Craig was chuffed that I quoted him
  • He said that if I were to make a simulation of a water fight on an Apple product, it would come across as something very different to other platforms as it wouldn’t show water pistols, it would show guns.
  • Inga said she was glad that I mentioned about incorporating data into the simulation as that would make it not so random, but she doesn’t expect me to do so for a two week project
  • She said that it could make an interesting Honours project
  • They both said well done

I will most likely explore this again in the future!


