PRiSM JavaScript App Tutorial – Part III

2 September 2021

Dr Christopher Melen (PRiSM Software Research Engineer)

Welcome back

Welcome to the third and final part of PRiSM’s series of tutorials on audio application development using modern JavaScript. In the previous part we built the app’s audio engine using the Web Audio API, then added code to enable the user to interact with the application and control and modify the audio signal. In this final part we will be bringing our application to life with code for rendering audio waveforms using React.

Rendering Waveforms

We have the audio engine of our application in place and the controls to interact with it, and we could happily leave things there. If all we are interested in is audio output, and a way to control it, our application is complete. Many audio apps go one step further, however, and render the audio data in some way. So let's go ahead and add this feature to our app as well.

Save the following code in a file called waveform.js in the src subfolder:

import React, { useLayoutEffect, useRef } from "react";
import { osc } from "./oscillator";

const Canvas = ({ hasAudio }) => {
  // Refs to the underlying canvas DOM node and to the current animation frame ID.
  const canvasRef = useRef();
  const requestRef = useRef();

  const drawWaveform = () => {
    // Schedule the next frame before drawing the current one.
    requestRef.current = requestAnimationFrame(drawWaveform);

    const canvas = canvasRef.current;
    const ctx = canvas.getContext("2d");
    const w = canvas.width;
    const h = canvas.height;
    ctx.clearRect(0, 0, w, h);

    // Fetch the latest block of waveform samples from the oscillator engine.
    const waveform = osc.getWaveform();

    ctx.lineWidth = 3;
    ctx.strokeStyle = '#5661FA';
    ctx.beginPath();

    // Horizontal scale: spread the samples across the full width of the canvas.
    const sc = w / waveform.length;

    for (let i = 0; i < waveform.length; i++) {
      const x = i * sc;
      // Map sample values (in the range -1 to 1) onto the canvas height.
      const y = (0.5 + (waveform[i] / 2)) * h;
      if (i === 0) {
        ctx.moveTo(x, y);
      } else {
        ctx.lineTo(x, y);
      }
    }
    ctx.stroke();
  }

  useLayoutEffect(() => {
    const canvas = canvasRef.current;
    const ctx = canvas.getContext("2d");
    if (hasAudio) {
      requestRef.current = requestAnimationFrame(drawWaveform);
    }
    else {
      cancelAnimationFrame(requestRef.current);
      ctx.clearRect(0, 0, canvas.width, canvas.height);
    }
    return () => cancelAnimationFrame(requestRef.current);
  }, [hasAudio])

  return (
    <canvas id="canvas" width={800} height={500} ref={canvasRef} />
  )
}

export default Canvas;

We also need to import this component into our App component module, so add the following import statement at the top of src/app.js:

import Canvas from "./waveform";

Then add the necessary JSX so the component will be rendered. Slot the following in directly beneath the toolbar div (we’ll address the significance of the hasAudio prop later):

<Canvas { ...{ hasAudio } }/>

We render our waveform to a custom React component called Canvas (note the upper case), which wraps an HTML canvas element. The sharp-eyed might have noticed that sections of the above (for example the drawWaveform() function) look a lot like traditional JavaScript code, in that they seem to be directly modifying the DOM. This is exactly what is happening, and it seems in direct opposition to React's declarative, JSX-based model. The code in drawWaveform(), for example, is exactly the kind of thing we might have written if we were producing our app around 2011 (although likely with heavy use of jQuery). With React, however, it is still sometimes necessary to write code which manipulates DOM objects directly, such as the canvas element. This is not a defect in React; rather it shows how, despite its functional, declarative character, React explicitly accommodates more traditional rendering techniques where these cannot be avoided. Objects such as the canvas element have a naturally imperative rendering model, and React caters for this through its Hooks API, which we introduced in the previous tutorial. Here we will again leverage the useLayoutEffect() hook. Recall that the code passed to this hook through its first argument, which must be a function, runs as a side-effect, after the main render. So we will be rendering our waveform as a side-effect.

This pattern of rendering to DOM elements as a component side-effect is not something new, however. It has been part of React since early on – for example it is commonplace to find class component lifecycle methods, such as componentDidUpdate(), running code which directly modifies the DOM. Access to DOM elements is achieved in class components using the ref pattern, in which a mutable reference to a DOM node is stored in an instance variable. A ref is actually an object, and the reference to the DOM node is stored in the ref's current property. React provides an equivalent pattern for functional components through the useRef hook, which we mentioned briefly in the previous tutorial. useRef returns an object which can act somewhat like an instance variable, providing a mutable reference for anything we want to keep track of between renders – for example, we could use it to store a JavaScript object, a React component, or a reference to a DOM node. In our Canvas component we store a ref in a variable canvasRef, and pass that variable as the value of the ref prop of the canvas element. Now, whenever we want to access the underlying DOM object, for example to draw the waveform, or modify its dimensions, we simply reference the value of this variable.
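
To see the pattern in isolation, here is a minimal, hypothetical example (not part of our app) of a component which stores a ref to an input element and uses it imperatively to give the input focus:

import React, { useRef } from "react";

// The object returned by useRef() persists across renders, and React
// assigns the underlying DOM node to its `current` property once the
// element has been rendered.
const FocusInput = () => {
  const inputRef = useRef(null);

  const focusInput = () => {
    // Access the real DOM node imperatively via `current`.
    if (inputRef.current) {
      inputRef.current.focus();
    }
  };

  return (
    <div>
      <input ref={inputRef} type="text" />
      <button onClick={focusInput}>Focus the input</button>
    </div>
  );
};

export default FocusInput;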

Animating Things

Sound consists of waves, which move, and since we want to create visual representations of such waves we might reasonably expect them to move also. How do we achieve this in JavaScript? The answer lies in the call to requestAnimationFrame() at the start of our drawing function. This is part of the JavaScript Web API (it's actually a method of the ubiquitous window object, so doesn't need to be imported). Like many JavaScript functions and methods it takes a function as an argument. It schedules this function to be called before the browser's next repaint, and returns a non-zero long integer ID which can be used to cancel the request by passing it to another window method, cancelAnimationFrame(). If we wish to animate things the callback passed to requestAnimationFrame() must itself call requestAnimationFrame(), so our callback function is defined recursively (passing itself to requestAnimationFrame(), as in the above code). Note that React is oblivious to these repeated calls, since they do not modify component state and therefore do not trigger any updates.
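
Stripped of any React specifics, the basic pattern looks something like the following sketch (the tick() function and rafId variable are just illustrative names):

// A minimal sketch of the recursive requestAnimationFrame() pattern.
// Each callback schedules the next frame before doing its drawing work,
// and the returned ID is stored so the loop can be cancelled later.
let rafId;

const tick = () => {
  rafId = requestAnimationFrame(tick); // schedule the next frame
  // ... draw a single frame here ...
};

rafId = requestAnimationFrame(tick);   // start the loop

// Later, e.g. in response to a user action:
cancelAnimationFrame(rafId);           // stop the loop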

We start things off with an initial call to requestAnimationFrame(), passing in drawWaveform() as the callback. Due to its recursive nature our drawing function will then render repeatedly. Since we clear the canvas at the start of each call, with clearRect(), we always render a fresh waveform, rather than drawing over what has previously been rendered. Notice that we store the ID returned by requestAnimationFrame() in the current property of another ref object we have created, requestRef.

Remember from the previous tutorial that we added a control for starting and stopping audio output. It clearly makes sense for the rendered waveform to respond to this too: we want to render the waveform when we click START, and clear it when we click STOP. To manage the stopping and starting of audio we leveraged the useState hook, which gave us a state variable whose updates could be used to trigger the appropriate oscillator engine methods, as side-effects. We also need the Canvas component to respond to the hasAudio state variable, so that it can render the waveform not just once but whenever the value of this variable changes. Recall from the first tutorial that React components may update not only in response to changes in their internal state but also to changes in the values of arguments passed in, their props. So all we need to do is add hasAudio as a prop to the Canvas component, then ensure that the component responds appropriately to changes in this prop.
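
By way of illustration, the relevant part of the App component might look roughly like the sketch below. This is a rough sketch only: your App component from Part II will differ in its details, and the oscillator start/stop side-effects are omitted here.

import React, { useState } from "react";
import Canvas from "./waveform";

// The hasAudio state variable is toggled by the START/STOP button
// and passed down to Canvas as a prop.
const App = () => {
  const [hasAudio, setHasAudio] = useState(false);

  return (
    <div>
      <div className="toolbar">
        <button onClick={() => setHasAudio(!hasAudio)}>
          {hasAudio ? "STOP" : "START"}
        </button>
      </div>
      <Canvas { ...{ hasAudio } }/>
    </div>
  );
};

export default App;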

Now let's look at how the hasAudio prop is used inside the Canvas component. Here again is the useLayoutEffect() call:

useLayoutEffect(() => {
  const canvas = canvasRef.current;
  const ctx = canvas.getContext("2d");
  if (hasAudio) {
    requestRef.current = requestAnimationFrame(drawWaveform);
  }
  else {
    cancelAnimationFrame(requestRef.current);
    ctx.clearRect(0, 0, canvas.width, canvas.height);
  }
  return () => cancelAnimationFrame(requestRef.current);
}, [hasAudio])

Notice how this useLayoutEffect() contains logic which depends on the hasAudio prop. Whenever hasAudio is updated, useLayoutEffect() will respond by firing its callback function, starting or stopping the waveform according to the new value. If the value of hasAudio is true then we call our drawing function, although instead of being called directly it is passed to requestAnimationFrame(), whose return value is then stored in the current property of requestRef. If, on the other hand, the value of hasAudio is false then we call cancelAnimationFrame() to cancel our drawing function, passing it the current ID stored in requestRef. This ID will have been stored there by the most recent link in our chain of calls to requestAnimationFrame(), each of which returns a fresh ID. If we don't do this then the only way to stop the loop is by physically closing the browser tab or window!

The callback passed to useLayoutEffect() may also return a function, as ours does here, which React calls automatically when the component is 'unmounted', or removed from the page. We place a call to cancelAnimationFrame() there in order to ensure that the cycle of calls to our drawing function can be terminated not only through a user interaction but also when our component is destroyed, which might happen for any number of reasons beyond the user's control (failing to clean up component side-effects is a common cause of errors in React applications).

What Next?

Our app is now complete, but although it functions perfectly well it is still quite simplistic. We might wonder, therefore, if there is anything more we can do to improve it. For example, we haven't done any work on styling it, beyond a rudimentary bit of CSS. Also, at the moment our app is restricted to the web, and therefore dependent on a browser. But what if we wanted to release it as a standalone application? Luckily, modern JavaScript provides solutions to both of these problems, and many more besides.

Material-UI

Traditional web development split its resources between HTML for the page structure, JavaScript for event handling and CSS for styling. We saw in Part I how JavaScript gradually developed to the point where it was managing the creation of HTML. It is also possible to style React components with inline CSS in a style prop:

<div style={ { width: "300px", margin: "10px", backgroundColor: "yellow", color: "red" } }>
  <p>Hello, World!</p>
</div>

Material-UI is a React-based library which takes things much further, providing a range of custom components which implement Google's Material Design language. For example, the AppBar component:

<AppBar position="static">
  <Toolbar>
    <IconButton edge="start" className={classes.menuButton} color="inherit" aria-label="menu">
      <MenuIcon />
    </IconButton>
    <Typography variant="h6" className={classes.title}>
      News
    </Typography>
    <Button color="inherit">Login</Button>
  </Toolbar>
</AppBar>

Integrating Material-UI is beyond the scope of this tutorial, but the Material-UI website contains many easy-to-follow examples, and installing the library is as simple as:

npm install @material-ui/core

There are a number of libraries available which offer custom components and other advanced styling features, but Material-UI is perhaps the most popular and certainly the easiest to use for a relative beginner with React.
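
As a small taste of what using the library looks like once installed, here is a minimal, hypothetical component (not part of our app) which wraps a Material-UI Button:

import React from "react";
import { Button } from "@material-ui/core";

// A Material-UI Button used in place of a plain HTML <button>.
// The variant and color props select one of the built-in Material styles.
const StartButton = ({ onClick }) => (
  <Button variant="contained" color="primary" onClick={onClick}>
    START
  </Button>
);

export default StartButton;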

React Native

Although React was initially designed with the web in mind, today it is possible to build React applications directly for native platforms, such as phones, tablets, and even traditional desktop computers. Although web-based React and React Native are very similar, they are different environments and are installed separately. Perhaps the most striking difference is that in React Native there is no HTML or CSS, which shouldn't surprise us since React Native doesn't target the web. It provides similar components, however, and one could be forgiven for mistaking React Native code for ReactJS. For example, the following contains the View and Text components from React Native, which have rough HTML counterparts in the div and p elements:

import React, { Component } from 'react';
// StyleSheet replaces CSS for styling React Native components.
import { StyleSheet, View, Text } from 'react-native';

export default class App extends Component {
  render() {
    return (
      <View style={styles.container}>
        <Text style={styles.intro}>Hello world!</Text>
      </View>
    );
  }
}

// Styles are plain JavaScript objects rather than CSS rules.
const styles = StyleSheet.create({
  container: { flex: 1, justifyContent: 'center', alignItems: 'center' },
  intro: { fontSize: 18 },
});

As with Material-UI, converting the Oscillator App from ReactJS to React Native is well beyond the scope of this tutorial, and would require quite a number of changes to the code (not to mention replacing web-specific tooling such as Parcel).

We very much hope you enjoyed this series of tutorials, and indeed learnt something from them. Thanks for sticking with it! We hope to release more tutorials soon, so check back on the PRiSM website for regular updates. Also keep in touch via Twitter.

 
