
ICM | Final Project | Taylor Swift Deep Dive

The final project can be viewed here.

Taylor Swift - Deep Dive is a 5-part visualization of Taylor Swift's body of work. It is also part of a larger study on fandom culture (to be released soon).

The final website page was made using Wix, with iframes of five different p5.js sketches.

 

Part 1. Musicality

For this chart, all of Taylor Swift's singles are mapped according to genre, year released, key, beats per minute (BPM), and peak chart position in the US.

Part of the challenge for this section was deciding how to meaningfully visualize the different aspects of each single. This is how I ended up mapping the data (a rough mapping sketch follows the list):

  • Year released: x-axis

  • Key: y-axis (like in a musical sheet)

  • BPM: beating circle

  • Genre: color

  • Chart performance: size of circle
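Concretely, the position and size mapping boils down to something like this (the year range and pixel values here are placeholders, not the exact numbers in my sketch):

let keys = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B'];

// inside the loop over singles
let x = map(single.year, 2006, 2017, 80, width - 80);                        // year released -> x
let y = map(keys.indexOf(single.key), 0, keys.length - 1, height - 80, 80);  // key -> y, like a staff
let baseDiameter = map(single.peakchart, 1, 100, 60, 10);                    // a #1 hit gets the biggest circle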

Once the visualization concept was decided, I created a JSON file with 3 singles' data for prototyping. The structure of the file is as follows:

{ "data" : [ { "title" : "Tim McGraw", "year" : 2006, "album" : { "title" : "Taylor Swift", "artwork" : "assets/album_taylorswift.png" }, "bpm" : 152, "key": "C", "isMajor" : true, "genre" : ["country"], "peakchart" : 40, "artwork" : "assets/single_timmcgraw.png", "lyrics": "He said the way my blue eyes shined\r\nPut those Georgia stars to shame that night\r\nI said: \"That's a lie.\"\r\nJust a boy in a Chevy truck\r\nThat had a tendency of gettin' stuck\r\nOn back roads at night\r\nAnd I was right there beside him all summer long\r\nAnd then the time we woke up to find that summer had gone\r\nBut when you think \"Tim McGraw\"\r\nI hope you think my favorite song\r\nThe one we danced to all night long\r\nThe moon like a spotlight on the lake\r\nWhen you think happiness\r\nI hope you think \"that little black dress\"\r\nThink of my head on your chest\r\nAnd my old faded blue jeans\r\nWhen you think Tim McGraw\r\nI hope you think of me\r\nSeptember saw a month of tears\r\nAnd thankin' God that you weren't here\r\nTo see me like that\r\nBut in a box beneath my bed\r\nIs a letter that you never read\r\nFrom three summers back\r\nIt's hard not to find it all a little bitter sweet\r\nAnd lookin' back on all of that, it's nice to believe\r\nBut when you think \"Tim McGraw\"\r\nI hope you think my favorite song\r\nThe one we danced to all night long\r\nThe moon like a spotlight on the lake\r\nWhen you think happiness\r\nI hope you think \"that little black dress\"\r\nThink of my head on your chest\r\nAnd my old faded blue jeans\r\nWhen you think Tim McGraw\r\nI hope you think of me\r\nAnd I'm back for the first time since then\r\nI'm standin' on your street\r\nAnd there's a letter left on your doorstep\r\nAnd the first thing that you'll read\r\nIs when you think \"Tim McGraw\"\r\nI hope you think my favorite song\r\nSomeday you'll turn your radio on\r\nI hope it takes you back to that place\r\nWhen you think happiness\r\nI hope you think \"that little black dress\"\r\nThink of my head on your chest\r\nAnd my old faded blue jeans\r\nWhen you think Tim McGraw\r\nI hope you think of me\r\nOh, think of me\r\nMmm\r\nYou said the way my blue eyes shined\r\nPut those Georgia stars to shame that night\r\nI said, \"That's a lie.\"" }

]

}
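To give a sense of how the sketch consumes this file, here is a stripped-down version of the loading step (the file name singles.json is a placeholder):

let singlesData;      // parsed JSON
let artworks = [];    // p5.Image objects, one per single

function preload() {
  // Load the JSON first; once it arrives, queue up each single's artwork.
  singlesData = loadJSON('singles.json', function(json) {
    for (let single of json.data) {
      artworks.push(loadImage(single.artwork));
    }
  });
}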

A few things to note, and a summary of the code logic/sequence (a simplified sketch follows the list):

  1. In preload(), the sketch loads the JSON file first.

  2. Once the JSON file is loaded, it loads the artwork images and stores them in an array.

  3. In setup(), the sketch creates objects from the full JSON data and the stored images, so each object can also store derived attributes like x-position, circle color, etc.

  4. Next, the sketch draws the axes and saves the coordinates for each year and key.

  5. For each single / object:

  • X position depends on year.

  • Y position depends on key.

  • Color is mapped according to genre. Country is more yellow, pop is pink, rock is blue, etc. Some singles have multiple genres, so the sketch adds each genre's colors into a final set of RGB values.

  • Size is mapped inversely to peak chart position.

  • For the beating circle, the single's BPM is used in conjunction with sin() and frameCount.

  6. Data highlighting:

  • The sketch checks the distance of each data point to mouseX & mouseY.

  • If the cursor is within a data point's circle, the sketch displays the single's artwork and info, and highlights the year and key.

  7. Data filtering:

  • Using two selectors, the sketch toggles 2 booleans: isGenreDisplaying & isBPMDisplaying. The highlighting code checks these two variables before displaying artwork & info.
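Putting steps 5–7 together, the per-single drawing logic is roughly this simplified sketch (the field names on the single object and the numbers are placeholders, and the exact gating of the two booleans is simplified):

// `single` is assumed to hold x, y, baseDiameter, bpm, col, artwork, and title,
// all computed in setup() from the JSON data.
function drawSingle(single) {
  // Beating circle: at ~60 fps, one beat spans 60 * 60 / bpm frames.
  let beat = sin(frameCount * TWO_PI * single.bpm / 3600);
  let d = single.baseDiameter + beat * 6;
  fill(single.col);
  ellipse(single.x, single.y, d, d);

  // Data highlighting: only react when the cursor is inside this circle.
  if (dist(mouseX, mouseY, single.x, single.y) < d / 2) {
    // Data filtering: the selector-driven booleans gate the overlay.
    if (isGenreDisplaying || isBPMDisplaying) {
      image(single.artwork, single.x + 20, single.y - 20, 80, 80);
      text(single.title, single.x + 20, single.y + 70);
    }
  }
}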

(I also needed to collect this data for all 32 of Taylor Swift's singles.)

Source code for this part can be found here.

 

Part 2. Lyricism

This part is essentially a word cloud of the lyrics from all of Taylor Swift's singles.

The sketch loads the same JSON file as Part 1. The crucial part of the code was found on Stack Overflow:

var freq = wordFreq(allLyrics);

Object.keys(freq).sort().forEach(function(word) {
  // console.log("count of " + word + " is " + freq[word]);
  textSize(freq[word] * 0.4);
  text(word, xcount * x_space, ycount * y_space);
});

function wordFreq(string) {
  var words = string.replace(/[.]/g, '').split(/\s/);
  var freqMap = {};
  words.forEach(function(w) {
    if (!freqMap[w]) {
      freqMap[w] = 0;
    }
    freqMap[w] += 1;
  });
  return freqMap;
}

I'm not too familiar with non-p5 syntax, but wordFreq() returns the frequency of each word after stripping periods and splitting on whitespace (which also takes care of the \r\n escape sequences in the lyrics).

What I did then was to draw each word and vary the size according to frequency.
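For completeness, here is roughly how that placement bookkeeping could look; the xcount / ycount advancing and the spacing values below are just one way to do it, not the exact code:

let x_space = 90;
let y_space = 40;
let xcount = 0;
let ycount = 1;

let freq = wordFreq(allLyrics);
Object.keys(freq).sort().forEach(function(word) {
  textSize(constrain(freq[word] * 0.4, 6, 80));   // more frequent words get bigger text
  text(word, xcount * x_space, ycount * y_space);
  xcount++;
  if (xcount * x_space > width - x_space) {       // wrap to the next row at the canvas edge
    xcount = 0;
    ycount++;
  }
});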

Source code can be found here.

 

Part 3. Self-Branding

The sketch for Self-Branding was heavily based on my previous sketch.

It takes the color average of Taylor Swift's music videos. On the final page it scans two videos at a time because of slow loading, but on my local server I could do four at a time. The video sizes are also based on YouTube views.

The trickiest part of the sketch was figuring out how to load 32 music videos progressively. This is my sketch's code sequence:

  1. In preload(), the sketch loads a JSON file with paths to the videos.

  2. It then loads only the first 2 (or 4) videos in the list and puts them into an array called playingVideo[].

  3. In setup, the sketch plays the videos in playingVideo[].

  4. Every second, it scans the videos' pixels, calculates the averages, and draws them as long rectangles (see the sketch after this list).

  5. When the button is pressed, the sketch switches each playingVideo[] element's source (.src) to the next 2 (or 4) videos in the list.

  6. Once switched, it recalculates the widths based on YouTube views and resets the color average array.

  7. The color scanning gets called every second regardless.
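A simplified version of that scanning step might look like this (playingVideo[] is the array from the list above; averages[] and the once-per-second check are placeholders):

// Average the RGB values of one video's current frame.
function averageColor(video) {
  video.loadPixels();
  let r = 0, g = 0, b = 0;
  let n = video.pixels.length / 4;
  for (let i = 0; i < video.pixels.length; i += 4) {
    r += video.pixels[i];
    g += video.pixels[i + 1];
    b += video.pixels[i + 2];
  }
  return color(r / n, g / n, b / n);
}

function draw() {
  if (frameCount % 60 === 0) {   // roughly once per second at 60 fps
    for (let i = 0; i < playingVideo.length; i++) {
      averages[i].push(averageColor(playingVideo[i]));
    }
  }
  // ...each entry in averages[i] then gets drawn as a long, thin rectangle...
}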

Initially I called createVideo() 32 times, which crashed my browser without fail. Then I figured I could just switch the sources instead of creating that many video DOM elements.
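The source switching itself boils down to something like this (videoList, averages, and nextIndex are placeholder names; playingVideo[] is from the list above):

let nextIndex = 2;   // the first two videos are already playing

function nextVideos() {
  for (let i = 0; i < playingVideo.length && nextIndex < videoList.length; i++, nextIndex++) {
    // Reuse the same DOM <video> element; just point it at the next file.
    playingVideo[i].elt.src = videoList[nextIndex].path;
    playingVideo[i].loop();
    averages[i] = [];   // start this slot's color history over
  }
}

// Wired up to the button with button.mousePressed(nextVideos);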

Source code can be found here.

 

Part 4. Famous Relations

After scraping Instagram & Twitter, I created a JSON file with the following structure:

{ "name" : "Tom Hiddleston", "job" : "Actor", "type" : "Ex-Boyfriend", "ig_fol" : 3800000, "t_fol" : 3400000, "image" : "assets/rel_tom.png" }

I did not want to manually code the position of each person, nor arrange them neatly in a grid, so I took Shiffman's Random Circles with No Overlap code to assign each person a random coordinate.

The random coordinates and colors are constrained by relationship type (ex-boyfriends top left, enemies bottom left, squad members on the right half).
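The placement is essentially rejection sampling, in the spirit of Shiffman's example, but with the region picked from the relationship type. A rough sketch (the region splits and the r radius field are placeholders):

function placePerson(person, placed) {
  // Pick the screen region from the relationship type.
  let xmin = 0, xmax = width, ymin = 0, ymax = height;
  if (person.type === 'Ex-Boyfriend') { xmax = width / 2; ymax = height / 2; }   // top left
  else if (person.type === 'Enemy')   { xmax = width / 2; ymin = height / 2; }   // bottom left
  else                                { xmin = width / 2; }                      // squad: right half

  // Keep trying random spots until this circle overlaps nobody already placed.
  for (let tries = 0; tries < 1000; tries++) {
    let x = random(xmin + person.r, xmax - person.r);
    let y = random(ymin + person.r, ymax - person.r);
    let overlaps = placed.some(p => dist(x, y, p.x, p.y) < p.r + person.r);
    if (!overlaps) {
      person.x = x;
      person.y = y;
      placed.push(person);
      return true;
    }
  }
  return false;   // gave up after too many attempts
}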

Source code can be found here.

 

Part 5. NYT Mentions

I downloaded JSON data from the NYT API and used each article's word count value to map its y-axis position.
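A stripped-down version of that mapping, assuming the Article Search API's usual response.docs and word_count fields (the file name and ranges are placeholders):

let nyt;

function preload() {
  nyt = loadJSON('nyt_mentions.json');   // JSON previously downloaded from the NYT API
}

function setup() {
  createCanvas(900, 600);
  background(255);
  let docs = nyt.response.docs;
  for (let i = 0; i < docs.length; i++) {
    let x = map(i, 0, docs.length - 1, 50, width - 50);
    // Articles with a higher word_count sit lower on the canvas.
    let y = map(docs[i].word_count, 0, 3000, 50, height - 50);
    ellipse(x, y, 8, 8);
  }
}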

Source code can be found here.
