In this lesson, we will explore the importance of understanding ocean conditions when creating animations of ocean creatures. The movement and behavior of marine life are heavily influenced by factors such as water temperature, salinity, currents, and depth. By incorporating these variables into your animations, you can create more realistic and accurate depictions of ocean creatures.

One key aspect to consider is how different ocean conditions affect the movement of your creature. For example, a deep-sea creature living in colder, darker waters will have a slower and more sluggish animation compared to a tropical fish in warm, shallow waters. By studying the habitats of different marine species, you can adjust the speed and fluidity of your creature's movements to match their natural environment.

Additionally, understanding ocean conditions can help you create more dynamic and visually appealing animations. By incorporating details such as water clarity, wave patterns, and light penetration into your designs, you can add depth and realism to your creature's environment. This attention to detail will enhance the overall quality of your animations and draw viewers into the fascinating world of ocean life.

The next part of creating a GIF is making it move based on the motion of Wonder. You will gather data from the SenseHat’s movement sensors and connect that data to the code we just wrote to control the orientation of your image. Let’s start by commenting out the code we just wrote so that we can focus on the movement sensors for a few minutes. Go ahead and move your second set of triple quotation marks from before #### Rotate our creature to after the last sleep(1) command so that our rotation code is now included in our massive comment.
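If you are unsure what that looks like, here is a small sketch of the technique. The rotation calls inside the quotes are hypothetical stand-ins for whatever your rotation code actually says:

```python
# A triple-quoted string is evaluated as one big string and then ignored,
# so wrapping code in triple quotes effectively comments it all out.
commented_out = """
#### Rotate our creature
sense.set_rotation(90)   # hypothetical stand-in for your rotation code
sleep(1)
sense.set_rotation(180)
sleep(1)
"""
```

Because Python treats everything between the quotes as text, none of those lines run.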

Before we jump into the sensor code, we need to understand the data we are about to grab. We will work with three types of motion: roll, pitch, and yaw. These refer to rotation around the three axes of three-dimensional space, but it is easier to imagine the movement of an airplane. Even better, stand up and pretend to be an airplane yourself. Here are the three movements in terms of a plane’s motion:

  • Roll: if you hold your arms out at your sides, when you lean to the left and right and your arms rock up and down, you are rolling.
  • Pitch: going into a nose dive or pulling up to gain altitude are examples of pitching. Bending forward at the waist or leaning backwards simulates pitch.
  • Yaw: keep your arms held out level and turn left and right at your waist; that is yaw.

On Wonder, we pitch as we move forward and go up and down over waves. We experience roll from waves too, and when the wind fills our sails and causes the boat to lean away from the wind. Yaw also happens as a result of both wind and wave action, and it is what we control with our rudder when we turn. An airplane uses both roll and yaw when it turns, but a boat mostly uses just yaw since it sits on the flat surface of the ocean.

While this all may be easier to visualize with a boat or a plane, what about our Raspberry Pi? Which side of the Pi would the wings come out of? Which side of the Pi does it consider to be its nose/bow and which end is the tail/stern? We have to figure this out in order to know how our Raspberry Pi is measuring our environment. What the Raspberry Pi calls “pitch” might actually be “roll” on Wonder based on how we installed the Raspberry Pi. Let’s figure it out.

The command we are going to be using is get_orientation_degrees(). The SenseHat uses three different sensors—accelerometer, gyroscope, and magnetometer—to determine its orientation in space. If we call this function, we will get all three roll, pitch, and yaw measurements. First, create a new section in your code at the end of your program. Let’s store the results of this in a variable called orientation and print that value to the console so that we can see the format of the data:

#### Rotation sensors
orientation = sense.get_orientation_degrees()
print(orientation)

You should get the following result:

{'roll': 0.322104811027352, 'pitch': 353.6768392508526, 'yaw': 113.6919684709606}

Depending on how your Raspberry Pi is sitting, the numbers will probably be different. What we have here is a data structure called a “dictionary.” Just like a word dictionary, you can look things up in a computer dictionary if you know the “key” of what you are looking for. In this case, we can look up “pitch” in our dictionary by adding two more lines of code.
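Before we change the program, here is a tiny standalone sketch of how dictionary lookup works, using made-up orientation numbers rather than a live sensor reading:

```python
# A dictionary pairs each "key" with a value, like a word and its definition.
# These orientation numbers are made up for illustration.
orientation = {"roll": 0.3, "pitch": 353.7, "yaw": 113.7}

# Look up a value by its key:
print(orientation["pitch"])
```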

#### Rotation sensors
orientation = sense.get_orientation_degrees()
print(orientation)
pitch = orientation["pitch"]
print(pitch)

This should result in the following:

{'roll': 2.347089494116393, 'pitch': 352.7972112041584, 'yaw': 107.43045402105395}
352.7972112041584
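Those raw numbers carry far more decimal places than we need. If you want tidier output, one option is to round each reading. This sketch assumes the same dictionary shape the SenseHat returns, but uses a stored sample rather than a live reading:

```python
# A sample reading in the same shape the SenseHat returns.
orientation = {"roll": 2.347089494116393,
               "pitch": 352.7972112041584,
               "yaw": 107.43045402105395}

# round() trims each number to one decimal place for easier reading.
readable = {axis: round(degrees, 1) for axis, degrees in orientation.items()}
print(readable)
```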

Great, we can access our motion data! Now let’s return to the problem of figuring out how these three values relate to our Raspberry Pi. To do this, we can place our code inside a while loop so that we get live data from our sensors continuously.

#### Rotation sensors
orientation = sense.get_orientation_degrees()
while True:
    print(orientation)

Wait…moving the Raspberry Pi has no effect on the output—it just prints the same numbers endlessly. Why? Because we only told the SenseHat to get the orientation data once and then print it forever. We want to get fresh orientation data every time we print. Slight edit:

#### Rotation sensors
while True:
    orientation = sense.get_orientation_degrees()
    print(orientation)

If you run this code and then carefully pick up your Raspberry Pi, you can rotate it in different ways to figure out how it measures these movements. Watch how roll, pitch, and yaw change as you try rotating the SenseHat in different ways.
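To see why the read has to happen inside the loop, here is a simulation you can run without any hardware. Everything here is made up: the stand-in “sensor” is just a list of fake readings, standing in for what sense.get_orientation_degrees() would report as the Pi moves:

```python
# A stand-in "sensor": each poll returns the next (made-up) reading,
# the way a real SenseHat would report new numbers as the Pi moves.
readings = iter([
    {"roll": 0.3, "pitch": 353.7, "yaw": 113.7},
    {"roll": 2.3, "pitch": 352.8, "yaw": 107.4},
    {"roll": 5.1, "pitch": 350.2, "yaw": 101.9},
])

seen = []
for _ in range(3):
    orientation = next(readings)  # read INSIDE the loop, once per pass
    seen.append(orientation["roll"])

print(seen)
```

Because the read happens on every pass, each print shows a different value; reading once before the loop would print the first reading forever.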

What you should discover is that, for how the SenseHat is installed on Albus, the “roll” that the SenseHat is measuring is actually the pitch of the ship and the “yaw” and “pitch” sensors are responding to the ship’s roll. You should spend some time confirming this and making sure you understand how these are connected, since it will affect decisions you make in your code in the next few sections. Below is a diagram of the way the SenseHat measures these three motions.
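One way to keep this mapping straight in your code is to relabel the SenseHat’s values with ship terms as soon as you read them. This is only a sketch with made-up numbers, and it assumes the mapping described above (the SenseHat’s “roll” is the ship’s pitch, and its “yaw” tracks the ship’s roll); confirm the mapping on your own setup first:

```python
# Made-up SenseHat reading; on the boat this would come from
# sense.get_orientation_degrees().
orientation = {"roll": 4.2, "pitch": 351.0, "yaw": 98.5}

# Relabel according to how the SenseHat is mounted:
ship = {
    "pitch": orientation["roll"],  # SenseHat "roll" is the ship's pitch
    "roll": orientation["yaw"],    # SenseHat "yaw" tracks the ship's roll
}
print(ship)
```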

In the next section, we will connect the SenseHat to the motion we created before.

Image from the Raspberry Pi Foundation’s Tightrope Project.