Background
Music visualization was popularized by Windows Media Player, which shipped by default on virtually every Windows PC. At the time it was seen as an interactive substitute for the music video. A music visualizer generates animated imagery or illustration from an audio source, in this case music. It is usually generated in real time, so the visuals render, sync, and react to the music as we hear it. There are various techniques for visualizing music, ranging from simple effects to composite effects, which combine several simple visualization algorithms into a visualization that is much more complex. Most of the time, though, the music's frequency (pitch) and amplitude (loudness) are the two main driving factors. Since music is a subjective matter, to us an effective music visualization is one that attains a high degree of correlation with the music's frequency and amplitude: the visuals need to follow and make sense with the music.
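Both driving factors can be estimated from a single array of byte frequency bins, such as the one `AnalyserNode.getByteFrequencyData()` fills in the Web Audio API. A minimal sketch of that idea (the function names are ours, not from the project):

```javascript
// Sketch: estimate loudness (amplitude) and a dominant-pitch index
// from an array of byte frequency bins in the range 0-255.
function loudness(bins) {
  // the mean bin value approximates overall amplitude
  return bins.reduce((sum, v) => sum + v, 0) / bins.length;
}

function dominantBin(bins) {
  // the index of the strongest bin approximates the dominant pitch
  let best = 0;
  for (let i = 1; i < bins.length; i++) {
    if (bins[i] > bins[best]) best = i;
  }
  return best;
}
```

A visualizer then maps these two numbers to visual parameters such as scale, motion, or brightness every frame.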
In this computer graphics project, we were asked to create a JavaScript music visualization based on Three.js. Early on we learned Babylon.js, and found that even though it has a lot of easy sources to learn from and is still actively developed, Three.js was much more convenient for our purposes. It is far richer in features, covering effects, scenes, cameras, animations, lights, materials, shaders, objects, geometry, data loaders, utilities, and much more. It is also better documented than Babylon.js, but our deciding reason was its support for most industry-standard file formats. This proved useful, as we needed to import audio files and wanted to minimize the chance of troubleshooting problems.
Problems
We encountered several problems during the creation of this project. Our aesthetic was black and white: we wanted a minimalist visualization with a higher degree of complexity to it, and we did not think to add various colors to aid the visualization. Another problem was the live microphone feed. At first, we wanted to be able to visualize live microphone input to the computer, but we did not manage to get it working. Adding the camera shake proved tricky as well, but we managed to figure that out.
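The camera shake can be modeled as a random offset whose strength decays each frame. The sketch below is our reconstruction, not the report's full implementation: `strength` and `damper` echo the `camera.strength` and `camera.damper` fields that appear later in the snippet, but the decay scheme is an assumption.

```javascript
// Hypothetical damped camera shake: each frame returns a random offset
// in [-strength, strength], then reduces strength by a fixed damper.
function makeShake(strength, damper) {
  return {
    strength,
    next(random = Math.random()) {
      const offset = (random * 2 - 1) * this.strength; // signed offset
      this.strength = Math.max(0, this.strength - damper); // decay per frame
      return offset;
    },
  };
}
```

In a render loop, the returned offset would be added to the camera position each frame until the strength reaches zero.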
Related Work
To the best of our knowledge, several Three.js-based audio visualizer projects already exist. One of them was created by Janine, a student at Fullstack Academy, on September 29, 2017, using the JavaScript library Three.js together with the Web Audio API. In her project, users can choose among four songs that exemplify different average frequencies and can also decide how wide the visualization appears on screen. The height of each cube is mapped directly to the array of frequency data available at every moment, and the color changes along with it (demo: https://www.youtube.com/watch?v=wlvLEdYmK1o).
The difference between her project and ours is the way the music is visualized. Hers appears as a wave spectrum composed of smaller cubes, while our project visualizes the music in its own way, described under Implementation below.
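The cube-height mapping described above can be sketched as a linear scaling of each byte frequency bin (0-255) into a height range. This is our reconstruction for illustration, not Janine's actual code, and the range bounds are arbitrary:

```javascript
// Sketch: map one byte frequency bin (0-255) linearly into [minH, maxH].
function binToHeight(bin, minH, maxH) {
  return minH + (bin / 255) * (maxH - minH);
}

// One height per cube, recomputed every animation frame.
const heights = bins => bins.map(b => binToHeight(b, 1, 100));
```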
Implementation
Code Snippet
<html>
<head>
  <title>Audio</title>
  <style>
    body {
      margin: 0;
      color: black;
    }
    canvas {
      width: 100%;
      height: 100%;
    }
    #audio {
      position: absolute;
      bottom: 2%;
      left: 50%;
      transform: translateX(-50%);
      width: 50%;
      height: 5%;
      outline: none;
    }
    #file {
      position: absolute;
      left: 50%;
      transform: translateX(-50%);
    }
  </style>
</head>
<body>
  <script src="/node_modules/three/build/three.min.js"></script>
  <script src="/node_modules/three/examples/js/controls/OrbitControls.js"></script>
  <script>
    var id = null;
    var white = new THREE.Color('white');
    var color = new THREE.Color('black');

    // create renderer and attach its canvas to the page
    var renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // create scene
    var scene = new THREE.Scene();
    scene.background = new THREE.Color(white);

    // create camera
    var camera = new THREE.PerspectiveCamera(100, window.innerWidth / window.innerHeight, 0.1, 2000);
    camera.position.set(0, 0, 600);
    camera.updateProjectionMatrix();

    function render() {
      // get the frequency of the sound
      controls.enabled = true;
      analyser.getByteFrequencyData(data);
      var avgdata = avg(data);

      // split the spectrum into lower and upper halves and average each
      var lowerHalfArray = data.slice(0, (data.length / 2) - 1);
      var upperHalfArray = data.slice((data.length / 2) - 1, data.length - 1);
      var lowerAvg = avg(lowerHalfArray);
      var upperAvg = avg(upperHalfArray);
      // … (intervening code omitted from this excerpt) …
        camera.position.z += value3;
        camera.strength -= camera.damper;
      }

      id = requestAnimationFrame(render);
      controls.update();
      renderer.render(scene, camera);
    }
    // largest value in an array
    function max(params) {
      return Math.max.apply(Math, params);
    }

    // arithmetic mean of an array
    function avg(params) {
      var sum = 0;
      for (var i = 0; i < params.length; i++) sum += params[i];
      return sum / params.length;
    }
  </script>
</body>
</html>
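The lower/upper band split inside render() can be extracted into a pure function and checked in isolation. This mirrors the slice bounds of the snippet exactly, including the fact that the final bin is dropped from the upper half:

```javascript
// Split an array of frequency bins at (length / 2) - 1, as in render(),
// and return the average of each half.
function bandAverages(data) {
  const mid = Math.floor(data.length / 2) - 1;
  const lowerHalf = data.slice(0, mid);
  const upperHalf = data.slice(mid, data.length - 1); // last bin excluded
  const avg = a => a.reduce((s, v) => s + v, 0) / a.length;
  return { lowerAvg: avg(lowerHalf), upperAvg: avg(upperHalf) };
}
```

In the project, the lower-band average tends to track bass energy and the upper-band average treble, so the two can drive different visual parameters independently.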
References
https://github.com/asokawotulo/CG-Final-Project/blob/master
https://threejs.org/docs/index.html#manual/en
https://threejs.org/docs/#api/en/audio/AudioAnalyser
https://www.youtube.com/watch?v=wlvLEdYmK1o