Curtiss unveils funky new V8 battery in upcoming 217 hp electric motorcycle


Curtiss Motorcycles has been on a roll with innovative electric motorcycle designs. Now they’ve released an update to their upcoming 2020 Zeus electric motorcycle with a V8 styled battery.

Curtiss Motorcycles

Curtiss Motorcycles was previously known as Confederate Motorcycles. After rebranding and switching to all-electric motorcycles in 2017, the company released a series of prototype designs for their Zeus electric motorcycle.

The original prototype immediately won praise and awards after its unveiling in 2018.

Ever since, Curtiss has rolled out a series of updates and design changes to tweak the bike before it heads to full production.

Late last year the company divided the electric motorcycle world with the introduction of their V8 battery concept. While some praised the visually striking design, others derided the throwback to the dying gas-powered motorcycle era.

But the naysayers haven’t stopped Curtiss, who is now charging forward with the V8 battery design. The company’s latest update to the Zeus electric motorcycle shows the flared, cylindrical battery cases in the iconic V8 style.

curtiss zeus v8 electric motorcycle

As explained by Curtiss Designer Jordan Cornille:

“With the battery cells packaged inside eight cylindrical towers configured in a flaring radial ‘V’ pattern, we’re not only able to tap into Glenn’s iconic V8 form language, but we’re also able to achieve maximum battery cooling efficiency. In this case, there exists no compromise between form and function.”

Those eight cylinders hold 16.8 kWh of battery, which is more capacity than most other electric motorcycles. Without accessory or add-on packs, Zero’s largest battery currently tips the scales at 14.4 kWh.

Of course with 217 hp (160 kW) of motor power, the Zeus electric motorcycle will need all the battery capacity it can get. For comparison, Zero’s latest SR/F electric streetfighter has only half the motor power of the 2020 Zeus V8 electric motorcycle.

Curtiss was originally using motors from Zero in a dual motor setup on the Zeus, but ultimately switched to a proprietary Yasa P400 R series motor.

The Curtiss Zeus V8 is slated to enter production next year. It won’t come cheap though, with a current price of $75,000. You could buy two Harley-Davidson LiveWire electric motorcycles for that price and still have enough left over to pick up a 150 mph (241 km/h) Lightning Strike electric motorcycle.

The Zeus may sound expensive… and it is. But at least you’ll be getting an extremely unique bike for that cash. The front suspension is an eye-catching aluminum girder fork, the chassis is a hand-welded titanium/chromoly tubular frame with machined 6061 aluminum fusion, and the wheels are carbon fiber.

Curtiss Motorcycle’s previous V8 electric motorcycle rendering and the original bike that inspired it

Electrek’s Take

Is all of that strictly necessary? Of course not.

But the Zeus is an extremely powerful rolling work of art with high performance to match the high price. If you’re looking for an urban runaround, this isn’t the bike for you.

If you’re looking for a status symbol that actually comes with some pretty insane engineering and electronics, not to mention enough power and torque to peel the rubber off your tire, the Zeus might be for you.

I know that I’ll never be able to own a bike like this, but I still love that it exists. I’m holding out hope that I’ll get to try one someday. If that ever happens, I’ll take all of you Electrek readers along with me for a first ride report!


Subscribe to Electrek on YouTube for exclusive videos and subscribe to the podcast.

from Electrek http://bit.ly/2NyXW45
via IFTTT

Arduino Robot Arm and Mecanum Wheels Platform Automatic Operation


In this tutorial I will show you how I made the Mecanum Wheels Robot Platform from my previous video work together and operate automatically with my 3D printed Robotic Arm, which is also an Arduino project from one of my earlier videos.

Overview

So, we can control the Mecanum wheels robot with the custom-built Android application the same way as explained in the previous video. In addition to that, now the app also has buttons for controlling the robot arm.

Arduino Mecanum Wheels Robot and Robotic Arm Operating Automatically

The original robot arm control app actually had sliders for controlling the robot joints, but that was causing some problems with the arm stability. This way the arm works much better, so I will add this updated version of the robot arm control app and the Arduino code to the original robot arm project as well.

Nevertheless, the coolest feature of this robot is the ability to store the movements and then automatically repeat them.

Arduino Robot Automatic Operation

Using the Save button, we can simply store the positions of the motors for each step. Then we just need to click the Run button and the robot will automatically repeat the stored movements over and over again.

Building the Arduino Robot

Ok, so here I have the Mecanum wheels platform already assembled and you can find all details about it in my previous video.

Robotic Arm 3D Printed parts

Also, here I have the 3D printed parts of the robot arm and the servo motors, and now I will show you how to assemble them. Here’s the 3D model of this project.

You can download the 3D model and the STL files needed for 3D printing below.

STEP file:

Arduino Robotic Arm and Mecanum Wheels Platform 3D Model STEP File 3.93 MB

STL files for 3D printing:

Arduino Robotic Arm and Mecanum Wheels Platform STL Files 943.42 KB

The first servo of the robot arm will be directly mounted on the top cover of the mecanum wheels platform.

I marked the location, and using a 10mm drill I made several holes.

making opening on the top cover

Then using a rasp, I cut through the holes and then fine-tuned the opening for the servo. I secured the servo to the top plate using four M3 bolts and nuts.

Securing the servo to the top plate

Then, on the output shaft of this servo, using the round horn that comes as an accessory with the servo, we need to attach the next part, the waist of the robot arm. However, we can notice that this way the part sits around 8mm above the plate. Therefore, I attached two pieces of 8mm MDF board so that the waist part can slide on them, making the joint more stable.

Adding 8mm MDF boards to the top plate

The round horn is secured to the waist part using the self-tapping screws that come as accessories with the servo, and then the round horn is secured to the servo shaft using the appropriate bolts that also come with the servo.

Securing the round horn to the servo

Next we have the shoulder servo. We simply put it in place and secure it to the 3D printed part using self-tapping screws.

Securing the shoulder servo of the Arduino Robot

The round horn goes on the next part, and then the two parts are secured to each other using a bolt on the output shaft of the servo.

Assembling the Arduino Robot

We should note that before securing the parts, we need to make sure that the part has its full range of motion. Here I also added a rubber band to the shoulder joint so that it gives a little bit of help to the servo, because this servo carries the weight of the rest of the arm as well as the payload.

rubber band for helping the servo motor

In a similar way, I assembled the rest of the robot arm.

Arduino Robot - Mounting the Wrist Joint Servo Motor

Next, we need to assemble the gripper mechanism. The gripper is controlled with an SG90 servo motor, on which we first attach a custom designed geared link. We pair this link with another geared link on the other side, which is secured using an M4 bolt and nut. Actually, all the other links are connected using M4 bolts and nuts.

Robot Arm Gripper Mechanism Assembly

The 3D model of the gripper originally has 3mm holes, but I didn’t have enough M3 bolts, so I expanded the holes using a 4mm drill and used M4 bolts instead.

Once I assembled the gripper mechanism, I secured it to the last servo and so the robot arm was completed.

Robot Arm Finished Assembly

Next I did some cable management. I passed the servo wires through the holes designed for them in the robot arm. Using a 10mm drill I made a hole in the top plate so that the wires can pass through.

Arduino Robot Wiring Management

Using a zip-tie I secured all the wires together, and now what’s left is to connect them to the Arduino board.

Arduino Robot Circuit Diagram

Here’s the circuit diagram of this project and how everything needs to be connected.

Arduino Robot Arm and Mecanum Wheels Platform Circuit Diagram

You can get the components needed for this project from the links below:

*Please note: These are affiliate links. I may make a commission if you buy the components through these links. I would appreciate your support in this way!

In the previous tutorial I explained how the Mecanum wheels robot part works and also showed you how I made a custom PCB for it.

Custom PCB Design for the Arduino Robot - Arduino Mega Shield

I included a 5V voltage regulator on this PCB so that we can power the servo motors for this project, as they work at 5V. The voltage regulator is the LM350, which can handle up to 3 amps of current. All six servos of the robot arm can draw from around 2 amps to 3 amps of current, which means that it can handle them, but that will cause the regulator to get very hot.
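
To get a rough idea of why it runs hot: a linear regulator like the LM350 dissipates roughly (Vin − Vout) × I as heat, so assuming the platform’s roughly 12V battery as the input, the worst case is around (12V − 5V) × 3A ≈ 21W, which is far too much for the bare regulator package.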

12V DC fan and heat sink for cooling the LM350 voltage regulator

Therefore, I attached a heat sink to it, as well as a small 12V DC fan to blow some air, because the heat sink itself wasn’t enough to cool the regulator.

I connected the servos’ signal wires to the Arduino digital pins 5 to 10, and for power I used the 5V pin header on the PCB. Finally, I pushed all the wires inside the platform and secured the top plate to it using the two nuts.

Arduino Robotic Arm and Mecanum Wheels Platform Electronics

And that’s it, now we are done with this project.

Arduino Mecanum Wheels Robot And Robot Arm

Arduino Code

What’s left is to take a look at how the Arduino code and the Android application work. As the code is a bit longer, for better understanding I will post the source code of the program in sections, with a description for each section. And at the end of this article I will post the complete source code.

So first we need to define the six servos, the four stepper motors and the Bluetooth communication, as well as some variables needed for the program below. In the setup section we set the maximum speed of the steppers, define the pins to which the servos are connected, begin the Bluetooth communication and set the robot arm to its initial position.

#include <SoftwareSerial.h>
#include <AccelStepper.h>
#include <Servo.h>

Servo servo01;
Servo servo02;
Servo servo03;
Servo servo04;
Servo servo05;
Servo servo06;

SoftwareSerial Bluetooth(A8, 38); // Arduino(RX, TX) - HC-05 Bluetooth (TX, RX)

// Define the stepper motors and the pins they will use
AccelStepper LeftBackWheel(1, 42, 43);   // (Type:driver, STEP, DIR) - Stepper1
AccelStepper LeftFrontWheel(1, 40, 41);  // Stepper2
AccelStepper RightBackWheel(1, 44, 45);  // Stepper3
AccelStepper RightFrontWheel(1, 46, 47); // Stepper4

#define led 14

int wheelSpeed = 1500;

int lbw[50], lfw[50], rbw[50], rfw[50]; // arrays for storing positions/steps

int servo1Pos, servo2Pos, servo3Pos, servo4Pos, servo5Pos, servo6Pos; // current position
int servo1PPos, servo2PPos, servo3PPos, servo4PPos, servo5PPos, servo6PPos; // previous position
int servo01SP[50], servo02SP[50], servo03SP[50], servo04SP[50], servo05SP[50], servo06SP[50]; // for storing positions/steps
int speedDelay = 20;
int index = 0;
int dataIn;
int m = 0;

void setup() {
  // Set the maximum speed values for the steppers
  LeftFrontWheel.setMaxSpeed(3000);
  LeftBackWheel.setMaxSpeed(3000);
  RightFrontWheel.setMaxSpeed(3000);
  RightBackWheel.setMaxSpeed(3000);
  pinMode(led, OUTPUT);
  servo01.attach(5);
  servo02.attach(6);
  servo03.attach(7);
  servo04.attach(8);
  servo05.attach(9);
  servo06.attach(10);
  Bluetooth.begin(38400); // Default baud rate of the Bluetooth module
  Bluetooth.setTimeout(5);
  delay(20);
  Serial.begin(38400);
  // Move robot arm to initial position
  servo1PPos = 90;
  servo01.write(servo1PPos);
  servo2PPos = 100;
  servo02.write(servo2PPos);
  servo3PPos = 120;
  servo03.write(servo3PPos);
  servo4PPos = 95;
  servo04.write(servo4PPos);
  servo5PPos = 60;
  servo05.write(servo5PPos);
  servo6PPos = 110;
  servo06.write(servo6PPos);
}

Then in the loop section we start by checking whether there is any incoming data.

// Check for incoming data
  if (Bluetooth.available() > 0) {
    dataIn = Bluetooth.read();  // Read the data

This data comes from the smartphone, or more precisely the Android app, so let’s take a look at what kind of data it actually sends. The Android app is made using the MIT App Inventor online application. It consists of simple buttons which have appropriate images as background.

Arduino Robot Custom Build Android Application for Smartphone Control

If we take a look at the blocks of the app, we can see that all it does is send one-byte numbers when the buttons are clicked.

Robot Control App Blocks - Code
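
For reference, here is how the complete code further below interprets these one-byte commands: 0 stops the platform; 1 to 10 select the driving directions (forward, backward, sideways, the diagonals and rotation); 12 saves the current step; 14 runs the saved steps, 15 pauses them and 13 resets them; 16 to 27 jog the six arm servos in the positive or negative direction, one pair of numbers per joint; values between 30 and 100 set the wheel speed; and values between 101 and 250 set the arm speed delay.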

So, depending on the clicked button, we tell the Arduino what to do. For example, if we receive the number ‘2’ the mecanum wheels platform will move forward, using the moveForward custom function.

if (dataIn == 2) {
      m = 2;
    }
//
if (m == 2) {
      moveForward();
    }

This custom function sets all four stepper motors to rotate forward.

void moveForward() {
  LeftFrontWheel.setSpeed(wheelSpeed);
  LeftBackWheel.setSpeed(wheelSpeed);
  RightFrontWheel.setSpeed(wheelSpeed);
  RightBackWheel.setSpeed(wheelSpeed);
}

For moving in any other direction, we just need to rotate the wheels in the appropriate directions.
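
For example, to strafe to the right, the left-front and right-back wheels spin forward while the other two spin backward. This is the moveSidewaysRight() custom function from the complete code below (comments added here for clarity):

void moveSidewaysRight() {
  LeftFrontWheel.setSpeed(wheelSpeed);   // left front rolls forward
  LeftBackWheel.setSpeed(-wheelSpeed);   // left back rolls backward
  RightFrontWheel.setSpeed(-wheelSpeed); // right front rolls backward
  RightBackWheel.setSpeed(wheelSpeed);   // right back rolls forward
}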

For controlling the robot arm, we use the same method. Again, we have buttons in the app, and while holding a button, the corresponding robot arm joint moves in a particular direction.

Robot Arm Control App

As I mentioned earlier, in the original robot arm control app we were using sliders for controlling the positions of the servos, but that was causing some problems because we had to send text to the Arduino instead of 1-byte numbers. The problem is that the Arduino sometimes misses the text coming from the app and makes an error, or the robot arm shakes and behaves abnormally.

This way we simply send a single 1-byte number when a particular button is touched down.

Android app blocks for the servo control of the Arduino robot arm

The Arduino code enters the while loop of that number, and stays there until we release the button, because at that moment the app sends the number 0, which means the robot should do nothing.

// Move servo 1 in positive direction
    while (m == 16) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo01.write(servo1PPos);
      servo1PPos++;
      delay(speedDelay);
    }
    // Move servo 1 in negative direction
    while (m == 17) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo01.write(servo1PPos);
      servo1PPos--;
      delay(speedDelay);
    }

So, depending on the touched button, the servos move either in the positive or negative direction. The same working principle applies for all the servo motors. For changing the speed of movement, we use the values coming from the slider, which range from 100 to 250.

// If arm speed slider is changed
    if (dataIn > 101 & dataIn < 250) {
      speedDelay = dataIn / 10; // Change servo speed (delay time)
    }

By dividing them by 10 we get values from 10 to 25, which are used as the delay in milliseconds in the while loops that drive the servos.
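
In practice that means each degree of servo movement takes between 10 and 25 milliseconds, so sweeping a joint through 90 degrees takes roughly 0.9 to 2.25 seconds, depending on the slider position.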

For storing the robot movements, we simply save the current positions of the servos and the steppers into arrays, each time the Save button is clicked.

// If button "SAVE" is pressed
    if (m == 12) {
      //if it's initial save, set the steppers position to 0
      if (index == 0) {
        LeftBackWheel.setCurrentPosition(0);
        LeftFrontWheel.setCurrentPosition(0);
        RightBackWheel.setCurrentPosition(0);
        RightFrontWheel.setCurrentPosition(0);
      }
      lbw[index] = LeftBackWheel.currentPosition();  // save position into the array
      lfw[index] = LeftFrontWheel.currentPosition();
      rbw[index] = RightBackWheel.currentPosition();
      rfw[index] = RightFrontWheel.currentPosition();

      servo01SP[index] = servo1PPos;  // save position into the array
      servo02SP[index] = servo2PPos;
      servo03SP[index] = servo3PPos;
      servo04SP[index] = servo4PPos;
      servo05SP[index] = servo5PPos;
      servo06SP[index] = servo6PPos;
      index++;                        // Increase the array index
      m = 0;
    }

Then when we press the Run button we call the runSteps() custom function. This custom function runs through all stored steps using some for and while loops.

if (m == 14) {
      runSteps();

      // If button "RESET" is pressed
      if (dataIn != 14) {
        stopMoving();
        memset(lbw, 0, sizeof(lbw)); // Clear the array data to 0
        memset(lfw, 0, sizeof(lfw));
        memset(rbw, 0, sizeof(rbw));
        memset(rfw, 0, sizeof(rfw));
        memset(servo01SP, 0, sizeof(servo01SP)); // Clear the array data to 0
        memset(servo02SP, 0, sizeof(servo02SP));
        memset(servo03SP, 0, sizeof(servo03SP));
        memset(servo04SP, 0, sizeof(servo04SP));
        memset(servo05SP, 0, sizeof(servo05SP));
        memset(servo06SP, 0, sizeof(servo06SP));
        index = 0;  // Index to 0
      }
    }

We should note that it starts from the first saved position, goes to the last one, and then repeats that over and over again. Therefore, when saving the steps, we actually need to position the robot in a way that the first step has the same position as the last step. While running through the steps we can also change the speed of both the platform and the robot arm, as well as pause and reset all the steps.
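
This behavior comes from the main loop inside the runSteps() custom function shown in the complete code below, which walks through consecutive pairs of saved positions:

for (int i = 0; i <= index - 2; i++) {  // Run through all saved steps
  // drive the wheels to the stepper positions saved at step i,
  // then sweep each servo from its position at step i to its position at step i + 1
}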

Here’s the complete Arduino code for this Arduino robot project:

/*
       Arduino Robot Arm and Mecanum Wheels Robot
          Smartphone Control via Bluetooth
       by Dejan, www.HowToMechatronics.com
*/

#include <SoftwareSerial.h>
#include <AccelStepper.h>
#include <Servo.h>

Servo servo01;
Servo servo02;
Servo servo03;
Servo servo04;
Servo servo05;
Servo servo06;

SoftwareSerial Bluetooth(A8, 38); // Arduino(RX, TX) - HC-05 Bluetooth (TX, RX)

// Define the stepper motors and the pins they will use
AccelStepper LeftBackWheel(1, 42, 43);   // (Type:driver, STEP, DIR) - Stepper1
AccelStepper LeftFrontWheel(1, 40, 41);  // Stepper2
AccelStepper RightBackWheel(1, 44, 45);  // Stepper3
AccelStepper RightFrontWheel(1, 46, 47); // Stepper4

#define led 14

int wheelSpeed = 1500;

int lbw[50], lfw[50], rbw[50], rfw[50]; // arrays for storing positions/steps

int servo1Pos, servo2Pos, servo3Pos, servo4Pos, servo5Pos, servo6Pos; // current position
int servo1PPos, servo2PPos, servo3PPos, servo4PPos, servo5PPos, servo6PPos; // previous position
int servo01SP[50], servo02SP[50], servo03SP[50], servo04SP[50], servo05SP[50], servo06SP[50]; // for storing positions/steps
int speedDelay = 20;
int index = 0;
int dataIn;
int m = 0;

void setup() {
  // Set the maximum speed values for the steppers
  LeftFrontWheel.setMaxSpeed(3000);
  LeftBackWheel.setMaxSpeed(3000);
  RightFrontWheel.setMaxSpeed(3000);
  RightBackWheel.setMaxSpeed(3000);
  pinMode(led, OUTPUT);
  servo01.attach(5);
  servo02.attach(6);
  servo03.attach(7);
  servo04.attach(8);
  servo05.attach(9);
  servo06.attach(10);
  Bluetooth.begin(38400); // Default baud rate of the Bluetooth module
  Bluetooth.setTimeout(5);
  delay(20);
  Serial.begin(38400);
  // Move robot arm to initial position
  servo1PPos = 90;
  servo01.write(servo1PPos);
  servo2PPos = 100;
  servo02.write(servo2PPos);
  servo3PPos = 120;
  servo03.write(servo3PPos);
  servo4PPos = 95;
  servo04.write(servo4PPos);
  servo5PPos = 60;
  servo05.write(servo5PPos);
  servo6PPos = 110;
  servo06.write(servo6PPos);
}

void loop() {
  // Check for incoming data
  if (Bluetooth.available() > 0) {
    dataIn = Bluetooth.read();  // Read the data

    if (dataIn == 0) {
      m = 0;
    }
    if (dataIn == 1) {
      m = 1;
    }
    if (dataIn == 2) {
      m = 2;
    }
    if (dataIn == 3) {
      m = 3;
    }
    if (dataIn == 4) {
      m = 4;
    }
    if (dataIn == 5) {
      m = 5;
    }
    if (dataIn == 6) {
      m = 6;
    }
    if (dataIn == 7) {
      m = 7;
    }
    if (dataIn == 8) {
      m = 8;
    }
    if (dataIn == 9) {
      m = 9;
    }
    if (dataIn == 10) {
      m = 10;
    }
    if (dataIn == 11) {
      m = 11;
    }
    if (dataIn == 12) {
      m = 12;
    }
    if (dataIn == 14) {
      m = 14;
    }
    if (dataIn == 16) {
      m = 16;
    }
    if (dataIn == 17) {
      m = 17;
    }
    if (dataIn == 18) {
      m = 18;
    }
    if (dataIn == 19) {
      m = 19;
    }
    if (dataIn == 20) {
      m = 20;
    }
    if (dataIn == 21) {
      m = 21;
    }
    if (dataIn == 22) {
      m = 22;
    }
    if (dataIn == 23) {
      m = 23;
    }
    if (dataIn == 24) {
      m = 24;
    }
    if (dataIn == 25) {
      m = 25;
    }
    if (dataIn == 26) {
      m = 26;
    }
    if (dataIn == 27) {
      m = 27;
    }

    // Move the Mecanum wheels platform
    if (m == 4) {
      moveSidewaysLeft();
    }
    if (m == 5) {
      moveSidewaysRight();
    }
    if (m == 2) {
      moveForward();
    }
    if (m == 7) {
      moveBackward();
    }
    if (m == 3) {
      moveRightForward();
    }
    if (m == 1) {
      moveLeftForward();
    }
    if (m == 8) {
      moveRightBackward();
    }
    if (m == 6) {
      moveLeftBackward();
    }
    if (m == 9) {
      rotateLeft();
    }
    if (m == 10) {
      rotateRight();
    }

    if (m == 0) {
      stopMoving();
    }

    // Mecanum wheels speed
    if (dataIn > 30 & dataIn < 100) {
      wheelSpeed = dataIn * 20;
    }

    // Move robot arm
    // Move servo 1 in positive direction
    while (m == 16) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo01.write(servo1PPos);
      servo1PPos++;
      delay(speedDelay);
    }
    // Move servo 1 in negative direction
    while (m == 17) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo01.write(servo1PPos);
      servo1PPos--;
      delay(speedDelay);
    }
    // Move servo 2
    while (m == 19) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo02.write(servo2PPos);
      servo2PPos++;
      delay(speedDelay);
    }
    while (m == 18) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo02.write(servo2PPos);
      servo2PPos--;
      delay(speedDelay);
    }
    // Move servo 3
    while (m == 20) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo03.write(servo3PPos);
      servo3PPos++;
      delay(speedDelay);
    }
    while (m == 21) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo03.write(servo3PPos);
      servo3PPos--;
      delay(speedDelay);
    }
    // Move servo 4
    while (m == 23) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo04.write(servo4PPos);
      servo4PPos++;
      delay(speedDelay);
    }
    while (m == 22) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo04.write(servo4PPos);
      servo4PPos--;
      delay(speedDelay);
    }
    // Move servo 5
    while (m == 25) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo05.write(servo5PPos);
      servo5PPos++;
      delay(speedDelay);
    }
    while (m == 24) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo05.write(servo5PPos);
      servo5PPos--;
      delay(speedDelay);
    }
    // Move servo 6
    while (m == 26) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo06.write(servo6PPos);
      servo6PPos++;
      delay(speedDelay);
    }
    while (m == 27) {
      if (Bluetooth.available() > 0) {
        m = Bluetooth.read();
      }
      servo06.write(servo6PPos);
      servo6PPos--;
      delay(speedDelay);
    }

    // If arm speed slider is changed
    if (dataIn > 101 & dataIn < 250) {
      speedDelay = dataIn / 10; // Change servo speed (delay time)
    }

    // If button "SAVE" is pressed
    if (m == 12) {
      //if it's initial save, set the steppers position to 0
      if (index == 0) {
        LeftBackWheel.setCurrentPosition(0);
        LeftFrontWheel.setCurrentPosition(0);
        RightBackWheel.setCurrentPosition(0);
        RightFrontWheel.setCurrentPosition(0);
      }
      lbw[index] = LeftBackWheel.currentPosition();  // save position into the array
      lfw[index] = LeftFrontWheel.currentPosition();
      rbw[index] = RightBackWheel.currentPosition();
      rfw[index] = RightFrontWheel.currentPosition();

      servo01SP[index] = servo1PPos;  // save position into the array
      servo02SP[index] = servo2PPos;
      servo03SP[index] = servo3PPos;
      servo04SP[index] = servo4PPos;
      servo05SP[index] = servo5PPos;
      servo06SP[index] = servo6PPos;
      index++;                        // Increase the array index
      m = 0;
    }

    // If button "RUN" is pressed
    if (m == 14) {
      runSteps();

      // If button "RESET" is pressed
      if (dataIn != 14) {
        stopMoving();
        memset(lbw, 0, sizeof(lbw)); // Clear the array data to 0
        memset(lfw, 0, sizeof(lfw));
        memset(rbw, 0, sizeof(rbw));
        memset(rfw, 0, sizeof(rfw));
        memset(servo01SP, 0, sizeof(servo01SP)); // Clear the array data to 0
        memset(servo02SP, 0, sizeof(servo02SP));
        memset(servo03SP, 0, sizeof(servo03SP));
        memset(servo04SP, 0, sizeof(servo04SP));
        memset(servo05SP, 0, sizeof(servo05SP));
        memset(servo06SP, 0, sizeof(servo06SP));
        index = 0;  // Index to 0
      }
    }
  }
  LeftFrontWheel.runSpeed();
  LeftBackWheel.runSpeed();
  RightFrontWheel.runSpeed();
  RightBackWheel.runSpeed();

  // Monitor the battery voltage
  int sensorValue = analogRead(A0);
  float voltage = sensorValue * (5.0 / 1023.00) * 3; // Convert the ADC reading to the actual battery voltage (the factor of 3 accounts for the voltage divider)
  //Serial.println(voltage);
  // If voltage is below 11V turn on the LED
  if (voltage < 11) {
    digitalWrite(led, HIGH);
  }
  else {
    digitalWrite(led, LOW);
  }
}
void moveForward() {
  LeftFrontWheel.setSpeed(wheelSpeed);
  LeftBackWheel.setSpeed(wheelSpeed);
  RightFrontWheel.setSpeed(wheelSpeed);
  RightBackWheel.setSpeed(wheelSpeed);
}
void moveBackward() {
  LeftFrontWheel.setSpeed(-wheelSpeed);
  LeftBackWheel.setSpeed(-wheelSpeed);
  RightFrontWheel.setSpeed(-wheelSpeed);
  RightBackWheel.setSpeed(-wheelSpeed);
}
void moveSidewaysRight() {
  LeftFrontWheel.setSpeed(wheelSpeed);
  LeftBackWheel.setSpeed(-wheelSpeed);
  RightFrontWheel.setSpeed(-wheelSpeed);
  RightBackWheel.setSpeed(wheelSpeed);
}
void moveSidewaysLeft() {
  LeftFrontWheel.setSpeed(-wheelSpeed);
  LeftBackWheel.setSpeed(wheelSpeed);
  RightFrontWheel.setSpeed(wheelSpeed);
  RightBackWheel.setSpeed(-wheelSpeed);
}
void rotateLeft() {
  LeftFrontWheel.setSpeed(-wheelSpeed);
  LeftBackWheel.setSpeed(-wheelSpeed);
  RightFrontWheel.setSpeed(wheelSpeed);
  RightBackWheel.setSpeed(wheelSpeed);
}
void rotateRight() {
  LeftFrontWheel.setSpeed(wheelSpeed);
  LeftBackWheel.setSpeed(wheelSpeed);
  RightFrontWheel.setSpeed(-wheelSpeed);
  RightBackWheel.setSpeed(-wheelSpeed);
}
void moveRightForward() {
  LeftFrontWheel.setSpeed(wheelSpeed);
  LeftBackWheel.setSpeed(0);
  RightFrontWheel.setSpeed(0);
  RightBackWheel.setSpeed(wheelSpeed);
}
void moveRightBackward() {
  LeftFrontWheel.setSpeed(0);
  LeftBackWheel.setSpeed(-wheelSpeed);
  RightFrontWheel.setSpeed(-wheelSpeed);
  RightBackWheel.setSpeed(0);
}
void moveLeftForward() {
  LeftFrontWheel.setSpeed(0);
  LeftBackWheel.setSpeed(wheelSpeed);
  RightFrontWheel.setSpeed(wheelSpeed);
  RightBackWheel.setSpeed(0);
}
void moveLeftBackward() {
  LeftFrontWheel.setSpeed(-wheelSpeed);
  LeftBackWheel.setSpeed(0);
  RightFrontWheel.setSpeed(0);
  RightBackWheel.setSpeed(-wheelSpeed);
}
void stopMoving() {
  LeftFrontWheel.setSpeed(0);
  LeftBackWheel.setSpeed(0);
  RightFrontWheel.setSpeed(0);
  RightBackWheel.setSpeed(0);
}

// Automatic mode custom function - run the saved steps
void runSteps() {
  while (dataIn != 13) {   // Run the steps over and over again until "RESET" button is pressed
    for (int i = 0; i <= index - 2; i++) {  // Run through all steps(index)
      if (Bluetooth.available() > 0) {      // Check for incoming data
        dataIn = Bluetooth.read();
        if ( dataIn == 15) {           // If button "PAUSE" is pressed
          while (dataIn != 14) {         // Wait until "RUN" is pressed again
            if (Bluetooth.available() > 0) {
              dataIn = Bluetooth.read();
              if ( dataIn == 13) {
                break;
              }
            }
          }
        }
        // If speed slider is changed
        if (dataIn > 100 & dataIn < 150) {
          speedDelay = dataIn / 10; // Change servo speed (delay time)
        }
        // Mecanum wheels speed
        if (dataIn > 30 & dataIn < 100) {
          wheelSpeed = dataIn * 10;
          dataIn = 14;
        }
      }
      LeftFrontWheel.moveTo(lfw[i]);
      LeftFrontWheel.setSpeed(wheelSpeed);
      LeftBackWheel.moveTo(lbw[i]);
      LeftBackWheel.setSpeed(wheelSpeed);
      RightFrontWheel.moveTo(rfw[i]);
      RightFrontWheel.setSpeed(wheelSpeed);
      RightBackWheel.moveTo(rbw[i]);
      RightBackWheel.setSpeed(wheelSpeed);

      while (LeftBackWheel.currentPosition() != lbw[i] & LeftFrontWheel.currentPosition() != lfw[i] & RightFrontWheel.currentPosition() != rfw[i] & RightBackWheel.currentPosition() != rbw[i]) {
        LeftFrontWheel.runSpeedToPosition();
        LeftBackWheel.runSpeedToPosition();
        RightFrontWheel.runSpeedToPosition();
        RightBackWheel.runSpeedToPosition();
      }
      // Servo 1
      if (servo01SP[i] == servo01SP[i + 1]) {
      }
      if (servo01SP[i] > servo01SP[i + 1]) {
        for ( int j = servo01SP[i]; j >= servo01SP[i + 1]; j--) {
          servo01.write(j);
          delay(speedDelay);
        }
      }
      if (servo01SP[i] < servo01SP[i + 1]) {
        for ( int j = servo01SP[i]; j <= servo01SP[i + 1]; j++) {
          servo01.write(j);
          delay(speedDelay);
        }
      }

      // Servo 2
      if (servo02SP[i] == servo02SP[i + 1]) {
      }
      if (servo02SP[i] > servo02SP[i + 1]) {
        for ( int j = servo02SP[i]; j >= servo02SP[i + 1]; j--) {
          servo02.write(j);
          delay(speedDelay);
        }
      }
      if (servo02SP[i] < servo02SP[i + 1]) {
        for ( int j = servo02SP[i]; j <= servo02SP[i + 1]; j++) {
          servo02.write(j);
          delay(speedDelay);
        }
      }

      // Servo 3
      if (servo03SP[i] == servo03SP[i + 1]) {
      }
      if (servo03SP[i] > servo03SP[i + 1]) {
        for ( int j = servo03SP[i]; j >= servo03SP[i + 1]; j--) {
          servo03.write(j);
          delay(speedDelay);
        }
      }
      if (servo03SP[i] < servo03SP[i + 1]) {
        for ( int j = servo03SP[i]; j <= servo03SP[i + 1]; j++) {
          servo03.write(j);
          delay(speedDelay);
        }
      }

      // Servo 4
      if (servo04SP[i] == servo04SP[i + 1]) {
      }
      if (servo04SP[i] > servo04SP[i + 1]) {
        for ( int j = servo04SP[i]; j >= servo04SP[i + 1]; j--) {
          servo04.write(j);
          delay(speedDelay);
        }
      }
      if (servo04SP[i] < servo04SP[i + 1]) {
        for ( int j = servo04SP[i]; j <= servo04SP[i + 1]; j++) {
          servo04.write(j);
          delay(speedDelay);
        }
      }

      // Servo 5
      if (servo05SP[i] == servo05SP[i + 1]) {
      }
      if (servo05SP[i] > servo05SP[i + 1]) {
        for ( int j = servo05SP[i]; j >= servo05SP[i + 1]; j--) {
          servo05.write(j);
          delay(speedDelay);
        }
      }
      if (servo05SP[i] < servo05SP[i + 1]) {
        for ( int j = servo05SP[i]; j <= servo05SP[i + 1]; j++) {
          servo05.write(j);
          delay(speedDelay);
        }
      }

      // Servo 6
      if (servo06SP[i] == servo06SP[i + 1]) {
      }
      if (servo06SP[i] > servo06SP[i + 1]) {
        for ( int j = servo06SP[i]; j >= servo06SP[i + 1]; j--) {
          servo06.write(j);
          delay(speedDelay);
        }
      }
      if (servo06SP[i] < servo06SP[i + 1]) {
        for ( int j = servo06SP[i]; j <= servo06SP[i + 1]; j++) {
          servo06.write(j);
          delay(speedDelay);
        }
      }
    }
  }
}

So that’s pretty much everything for this tutorial. The project works well, but please note that it’s far from perfect. The automatic movements might not be that precise because of the slipping of the mecanum wheels, as well as the poor performance of the servo motors. These cheap servo motors can also shake or jitter even when not moving, simply because they don’t have enough strength to hold the weight of the 3D printed parts.

I hope you enjoyed this tutorial and learned something new. Feel free to ask any question in the comments section below and check my Arduino Projects Collection.

The post Arduino Robot Arm and Mecanum Wheels Platform Automatic Operation appeared first on HowToMechatronics.

from HowToMechatronics http://bit.ly/2RSS0RW
via IFTTT

Honda’s e Prototype is designed to delight you


There’s only one word that accurately describes the Honda e: adorable. The pop-out door handles. The compact wing mirrors that are actually tiny cameras. The simple, oval-shaped front that houses the Honda badge and two bright, circular headlamps. From every angle, it’s just freakin’ cute.

And I think that’s by design. The tiny hatchback is clearly built for urbanites who want something small, nippy and, by extension, joyful to drive. It will ship with a 35.5 kWh lithium-ion battery, which can manage — at least by contemporary EV standards — a rather pitiful 200 km (roughly 124 miles) on a single charge. That’s substantially less than many of its rivals, including the new Renault Zoe (242 miles) and Nissan Leaf e+ (226 miles). But the Honda e’s conservative range probably doesn’t matter. Most city dwellers, after all, rarely drive more than 200 miles per day.

Honda is banking on something else to sell its first fully-electric car: joy. The runaround is rear-wheel drive, for instance, which gives the car some "sporty character," according to Honda, and a ridiculously small turning circle of 4.3 metres. That figure will mean little in most US cities, which are known for their huge parking lots and wide, open roads. But in Europe — a continent filled with tight streets and back alleys — that maneuverability will be both practical and grin-inducing. The car has a perfectly even weight distribution, too, which should keep the handling razor sharp.

And then there’s the technology. The Honda e has five — yes, five — screens to peer at inside. The outer two show what the aforementioned side cameras are seeing. Another, tucked behind the steering wheel, will show the current speed, remaining charge, and other information that’s critical for the driver. The final two, positioned roughly in the center, will be for navigation and general infotainment. They all sit on top of a wood-like material (I’m pretty sure it’s a veneer) that looks purposefully old-fashioned. It’s a strange, but effective blend of futurism and Honda brand nostalgia that I can’t help but love.

As the car cruised past, I couldn’t help but smile and wave.

At the Goodwood Festival of Speed, I watched the Honda e take on the world-famous hillclimb. It was, to be perfectly honest, one of the slowest drives I’ve ever seen at the show. So slow, in fact, that both the driver and passenger were able to roll down their windows and wave at the crowd. I’m sure the car can move a little quicker — reporters have guesstimated that the car can go from nought to 60 mph (roughly 97 km/h) in eight seconds. But it didn’t really matter. As the car cruised past, I couldn’t help but smile and wave back at them, just like every good Brit does when a child or family waves from a passing boat or train.

Honda e Prototype

Honda’s compact EV should be no slouch to charge, either. With a CCS2 DC rapid charger (the car also accepts a Type 2 AC connector), Honda says you can get back 80 percent of the car’s maximum range (so roughly 160 km) in half an hour. Again, it’s the sort of figure that Honda hopes will delight you.

For now, the car is just a prototype. The company says that little should change in the production version, though, which will be available to order later this year in five colors, including a striking metallic yellow. The success of the Honda e will depend, inevitably, on its yet-to-be-announced price tag. The car’s range will be a disappointment to some and a deal-breaker for others. If the hatchback is priced low enough, however, it could tempt city slickers who, like me, have already fallen in love with its looks.

from Engadget https://engt.co/2YA5swO
via IFTTT

Architectural designs that focus on humans and nature alike!


Vertical gardens, urban farms and sustainable housing are the terms raging this year. And they should be the rage! Climate change and global warming are afflicting our planet this very minute, and every step we can take to help combat this issue needs to be taken right away. These architects have found a way to do their bit for the world. These buildings focus on creating greener spaces that pay as much attention to the humans residing in them as to their plant counterparts. Check out our collection of eco-friendly products that will help you do your bit in saving the planet.

PARK ROYAL on Pickering Hotel by Woha Architects 

Shilda winery in Kakheti, Georgia by X-Architecture 

The Rebel Residence designed by StudioninedotsDelva 

Off The Grid Office by Stefan Mantu 

The Trudo Vertical Forest in Eindhoven, Netherlands, comes with 125 housing units where each apartment will have a surface area of under 50 sq.m. and the exclusive benefit of 1 tree, 20 shrubs, and over 4 sq.m. of terrace space by Stefano Boeri Architetti for Sint-Trudo

Planar House by Studio MK27 – Marcio Kogan + Lair Reis in Porto Feliz, Brazil 

Bert, a conceptual modular treehouse shaped like a tree trunk, with large round windows designed to make it look like the single-eyed character from the film Minions by Studio Precht 

Bamboo nest smart-towers for the future of Paris by Vincent Callebaut 

L’Oasis D’Aboukir (the Oasis of Aboukir) is a 25-meter-high green wall by botanist and researcher Patrick Blanc 

The landscaped A-Frames on the facade of our Hilton Hotel In Hyderabad by Precht 

BIONIC ARCH, A Vertical Forest for the Taichung City Hall by Vincent Callebaut Architectures

from Yanko Design http://bit.ly/2XlBik6
via IFTTT

IP strategy: How should startups decide whether to file patents

Bryant Lee, Ed Steakley & Saleh Kaihani
Contributor
Bryant Lee is the co-founder and managing partner of Cognition IP, a YC-backed IP law firm for startups. Ed Steakley was the Head of IP at Airware and Senior Patent Counsel at Apple. Saleh Kaihani is a partner at Cognition IP.

Deciding what to patent can be confusing, but with a formal process in place it is something that every startup can manage.

Intellectual property (IP) is one of the most valuable assets of a startup and patents are often chief among IP in terms of value. Patents allow the startup to prevent competitors from using their technology, which is a powerful feature that can grant unique advantages in the marketplace.

From a business perspective, patents can help drive investment and acquisitions, provide protection during partnerships and business deals, and help the company defend itself against patent lawsuits brought by others.

However, startups also often have a hard time determining when and what to patent. Innovative startups are inventing new things on a regular basis, and there is a danger of slipping into a haphazard approach of patenting whatever happens to be available rather than systematically analyzing the business needs of the company and protecting the IP that moves the needle the most.

Moreover, startups must balance the need to protect IP with other areas of the business: Patents are complex documents that require an investment of time and resources to obtain. They often require specialized legal counsel to write and a lengthy examination process at the U.S. Patent & Trademark Office (USPTO).

This article is a how-to guide for startups to make the decision on when and what to patent with a mature approach to IP strategy.


Creating a regular IP harvesting process


In order to make a decision about what to patent, a startup must first know what IP it has. For very small teams, it may be possible for everyone to have a shared idea of the IP. However, once teams grow beyond a few people, it is no longer possible to have complete visibility into what everyone on the team is doing and potentially inventing. Therefore, a regular IP harvesting process must be put in place to ensure proper reporting of IP to the executive level.

Most startups are best served with a simple IP harvesting process involving just three steps: (1) disclosure (2) invention review and (3) patent filing. In the disclosure stage, employees who are in IP creation roles must be trained to disclose ideas that are potentially protectable IP.

from TechCrunch https://tcrn.ch/2LzXUX6
via IFTTT

Explore the visual wonders of TouchDesigner: Summit, free workshop video


Few software tools have proved as expressive in generative visuals or audiovisual performance as TouchDesigner. Get introduced to its AV powers in a new, free video – or if you can make it to Montreal, get the full experience live.

TouchDesigner is a dataflow tool – a graphical, patchable development environment – uniquely suited to squeezing gorgeous eye candy out of your computer graphics card. It’s also special for being musical and modular. It’s pretty enough that I’ve seen its actual zoomable UI displayed as art in performances, but whether or not you share that with the audience, it’s a kind of digital, graphical counterpart to the renewed love of cables and patching in sound.

Russian-born, Berlin-based Stanislav Glazov has gone deep into that world both as a teacher and as an artist. (You can catch his visuals this week as part of the UY ZONE, a fashion-meets-performance immersive environment inside Berghain in Berlin, or as a solo artist or working with techno legend Dasha Rush around Europe and Russia.)

Stas is happy to help you decipher the mysterious arts of TouchDesigner yourself in his online workshop series. But you’ll probably want to start at the beginning – or, even if you have some TouchDesigner background, better understand Stas’ take on it. Over the weekend, he led a free online workshop, and now you can watch at your leisure on YouTube:

If that taste has you excited, though, you might want to think about being in Montréal in August – timed perfectly with the massive MUTEK festival.

It’s not the first time there’s been an event around this tool, but this is surely the biggest. The day program alone features:

  • 350 participants
  • 69 presenters
  • 45 workshops
  • 21 talks

And that’s all in 3 days, packed onto the Coeur des Sciences / UQAM campus. The organizers describe it as “an intensive forum and stimulating meeting ground for the TouchDesigner community to share knowledge and experiences, learn new skills, connect in person with your favorite TD mentors and peeps and make a lot of new friends and collaborators.”

The night program promises still more, with an “after dark” social program, with 404.zero, ELEMAUN / Ali Phi, our friend Procedural, and Woulg.

TouchDesigner for its part has been expanding with lots of new features, including a specialized module for performing with lasers. That in turn is being used in the incredible collaboration of Robert Henke and Christopher Bauder – hope to cover that more soon:

Full details:

https://2019.touchdesignersummit.com/

And to check out Stas’ paid video courses:

https://lichtpfad.selz.com/

Image at top – deadmau5, prepping as his live show is built in TouchDesigner. Find lots more inspiration like this on the blog – I could page through that all day:

https://www.derivative.ca/Blog/

The post Explore the visual wonders of TouchDesigner: Summit, free workshop video appeared first on CDM Create Digital Music.

from Create Digital Music http://bit.ly/30enQvv
via IFTTT

A NASA satellite caught yesterday’s solar eclipse and a Category 4 hurricane at the same time — here’s the video



A solar eclipse passed near a powerful hurricane on Tuesday, and a satellite in space saw the whole thing.

The total eclipse — when the moon completely blocks the sun’s light — was visible in parts of Chile and Argentina, as well as over the Pacific Ocean, starting at 4:39 p.m. local time.

The National Weather Service tweeted a video from the GOES satellite system, which is run jointly by NASA and the National Oceanic and Atmospheric Administration (NOAA). The clip shows the moon’s shadow passing over the South Pacific – and right past a hurricane.

In the footage, the moon’s shadow is seen just south of Hurricane Barbara, which is the second hurricane to arrive in the 2019 Pacific hurricane season. The moon’s shadow, of course, is not usually visible on Earth — this only happens when the moon comes in between our planet and the sun, blocking the sun’s light from hitting a swath of the Earth. That’s an eclipse.

The lighter shadow around that dark core comes from a partial solar eclipse — when the moon only blocks out part of the sun.

Graphic: how a total solar eclipse works

This was the only total solar eclipse of the year and the first since August 2017, when the moon’s shadow followed a northwest-to-southeast path of totality across the United States.

Hurricane Barbara grew into a Category 4 storm on Monday. Categories are determined by a storm’s sustained wind speeds, so Category 4 means winds of 130-156 mph (210-250 kph).

The National Hurricane Center does not expect it to approach land, however. Barbara is currently more than 1,200 miles southwest of the southern tip of Baja California, and moving west and into open waters. 

 

 


from SAI http://bit.ly/2J9xXMq
via IFTTT

Social media is revolutionizing how scientists interact with the public


The field of science communication — the practice of informing and educating people about science-related topics — arose just after the start of the Enlightenment when Francesco Algarotti published his first edition of Newtonianism for the Ladies in 1737. While that bit of 18th century mansplaining doesn’t really hold up by today’s standards, in the nearly three centuries since, the pace of scientific progress has only accelerated — with science communication evolving alongside it. The advent of social media, in particular, is an unprecedented, powerful tool for science communicators.

"It was right after the election and I noticed that there was all this energy in the community, thinking about how we could better communicate our science to the public," University of Connecticut PhD student Sarah McAnulty told Engadget. "I thought we needed some way to engage scientists, in a low time-commitment, high-impact, kind of way."

The result is Skype a Scientist. Launched in 2017, it connects researchers from a broad range of fields with students, teachers and other interested groups via, well, Skype. Each meeting lasts 30 minutes to an hour and operates as an informal Q&A session.

"Typically it is structured as question and answer sessions, because we want people to feel as though they’ve really met a scientist, not just got lectured," McAnulty continued. "We want people to get answers to what they actually want to know about. That’s really important."

The operation itself is fairly straightforward. Teachers and interested parties fill out a Google form with their schedule availability while researchers and scientists fill out a similar form of their own. Then, a sorting algorithm designed by bioinformatician David Jenkins, a PhD student at Boston University, matches up the two groups for a session. "It’s free," McAnulty points out. "As long as you have an internet connection, you’re good to go."

Before the advent of the internet, this sort of interaction simply wouldn’t be feasible. Similar programs do exist, such as Letters to a Pre-Scientist, but nothing on this scale. In the last two and a half years, Skype a Scientist has served 15,000 classrooms and signed up 6,000 individual researchers to participate.

"I basically did this whole thing via Twitter, I tweeted about it," McAnulty said. "And then the word of mouth spread extraordinarily quickly. Without that social media aspect of scientists talking to each other on Twitter, I can’t imagine I would have gotten this many teachers or scientists."

Before Skype a Scientist, McAnulty launched the Squid Scientists Tumblr page in 2014. "Originally, it was just I wanted to see if it was possible, because Tumblr, generally speaking, wasn’t a place where science communication was happening too much." Still, McAnulty found Tumblr to be less hostile to women than Reddit, and that it skewed toward a younger audience than Twitter.

"I get more questions from Tumblr from young women who are thinking about being a scientist or just want to know more before they make a choice about what kind of careers they think they could see themselves in," she said. "So Tumblr has been really powerful for that."

Indeed, the elimination of communication barriers and the waning influence of traditional "gatekeepers" to the scientific community has given female, PoC, LGBTQ+ and non-binary researchers a direct line to an interested public. And given that a 2018 study found that only around 30 percent of studies published in the Nature Index journals were penned by female researchers, that ability to connect with not just the public but other researchers as well could help reduce that discrepancy.

McAnulty notes that mainstream science media outlets like the Discovery Channel or NatGeo will cast their scientist hosts based on who will return the best ratings. "In the process, they are choosing scientists that they think people will view as scientists," she said, "It’s a positive feedback loop of sexism."

However, with the rise of social media, especially Twitter, Instagram and YouTube, researchers from underrepresented groups don’t have to wait for NatGeo to come knocking. They can produce their own content, cultivate their own audiences and share their passion for science directly. "The more that we’re engaging with the public — and even engaging in our own communities — the more representation you have of everybody, the better and the stronger our scientific community will be," McAnulty said.

The podcasting community has also become a hotbed for science communication. Take This Week In Science, for example. Originally a live radio show broadcast from KDVS on the University of California, Davis campus, it now reaches listeners in 60 countries as a weekly podcast. Neurophysiologist and science communicator, Dr. Kirsten "Kiki" Sanford, founded the show in 2000.

"I was a graduate student when I started it and was really interested in the idea of talking with people about the stuff I was learning," she explained to Engadget. "I would hang out with my neighbors and we would talk about things that we had learned recently, things that were cutting edge research, and just how exciting they were."

She quickly realized that there wasn’t much of that sort of content available. "The only radio show at the time in the area that I lived, was Science Friday, which was great, but that was it," Sanford said. "And so we approached the local college radio station to see if they wanted to have a science show."

In the 19 years (and 500-plus episodes) since, TWIS has held a number of live tapings at local clubs and science festivals. "I enjoy doing live shows, because there’s that instant feedback," Sanford said. "You can see people’s faces, whether or not they’re engaged in what you’re talking about whether or not they’re bored. I can up-regulate what I’m saying, I can shift the way that I’m explaining it, I can ask the audience a question and you know, get a show of hands or get a response right then and there."

Sanford and her team are expanding into other areas of social media, such as their recently-launched monthly newsletter. "I’d like to be able to get the show to stable financial basis, where we can put more time into doing shorter content for YouTube, or maybe a daily show" Sanford continued. "One fun idea that have been bounced around recently: I have an eight year old son and he’s getting interested in [science]. So we’ve been talking about having a Twitch Junior program."

These sorts of conversations wouldn’t have occurred without the rise of these platforms. "With the access that people have, especially social media, I am seeing so many more scientists talking to people, not just to each other, but to people who are just like, ‘Oh, what is this thing you study?’" Sanford noted. "And suddenly there’s a conversation happening. That didn’t happen before."

Science communication is having an outsized effect on the scientific job market as well, Sanford points out, with people carving out careers in a field that didn’t exist a decade ago. "You had science writers, you had science journalists, but to the idea of a science communicator?" Sanford quipped, "Now people are calling themselves science communicators all over the place. It’s amazing."

Though social media’s open access regularly serves as a double-edged sword, with conspiracy theorists intentionally spreading misinformation online, both McAnulty and Sanford remain optimistic that the scientific community will be able to minimize the damage those bad-faith actors might cause.

"That’s social media’s equality, and that is a blessing and a curse," McAnulty said. "I guess one of the goals for my science communication, and my career, is to help people connect with sources of information that they can trust."

from Engadget https://engt.co/2Xvqu3e
via IFTTT

I made my own digital camera using an Arduino, a projector and a photoresistor


The Flying Pixel Portrait Camera uses a video beamer, a single photoresistor, an Arduino and a PC for taking photos of people’s faces. The beamer ‘scans’ the image by projecting a small white square onto a person’s face inside an otherwise completely dark chamber. While the projected square slowly moves over the entire face, the photoresistor captures the reflected luminosities.

This generates a proportional analog electric signal which is digitized by an Arduino and transmitted to the PC. As the PC also controls the position of the projected square, it can now construct an image based on the different brightness values that it receives, one pixel at a time.

The scanning speed for the pictures which are shown above was rather slow. The speed is limited by the framerate of the projector, as only one pixel can be projected and thus can be captured at a time. All the faces are scanned with 30 pixels / second and since each image is 50 * 50 pixels large, it took 83 seconds to take one photo.

The setup was built out of recycled cardboard boxes, which I found in the cellar of the art school.

Instead of projecting a white pixel, one can also project a red, a green and a blue pixel one after another. This way, it is possible to scan RGB color images. Of course, a color scan is three times slower than a monochromatic scan – and the resulting image is also a lot more noisy, as a colored pixel is less bright than a white pixel. In fact, the blue channel of the color images that I made is mainly noise and contains almost no usable signal, because the LDR is not very sensitive to blue light.

The image quality suffered a lot when I tried to take color photos with the same setup.

Technical implementation

The diagram below shows the hardware setup of this installation. The program running on the computer is written in Processing and you can find it here. The Arduino is flashed with the Firmata firmware. This handy firmware gives direct access to the Arduino pins via the Processing sketch. This way, the computer can read the analog voltages from the photoresistor/voltage divider without writing any extra Arduino code.
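
Since Firmata takes care of the communication, the project doesn’t need any custom Arduino sketch at all. Just to illustrate what happens on the Arduino side, a minimal standalone sketch that reads the photoresistor/voltage divider and streams the readings to the PC could look like the following (the analog pin and baud rate are assumptions, not taken from the actual setup):

// Illustrative only - the actual project uses the Firmata firmware instead.
// The photoresistor forms a voltage divider whose midpoint is assumed to be wired to A0.
const int LDR_PIN = A0;

void setup() {
  Serial.begin(115200); // assumed baud rate
}

void loop() {
  int brightness = analogRead(LDR_PIN); // 0-1023, proportional to the reflected light
  Serial.println(brightness);           // one reading per line for the PC to parse
  delay(10);                            // roughly 100 readings per second
}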

Inspiration for the project

I have to admit that I didn’t come up with this concept. Some clever minds had the idea of scanning images this way roughly 100 years ago. The technique is called ‘flying-spot scanning’ and it was used in the early days of mechanical television. I just thought it would be interesting to recreate the setup with today’s technologies. Instead of illuminating faces with an arc light shining through a spinning Nipkow disk, I used a beamer. And instead of transmitting the images to a ‘televisor’ (this is what mechanical TV receivers were called back in the day), I recorded the results of the scans as still images on my computer.

A drawing from “Radio News”, April 1928, which I found on Wikipedia. On the top left you can see the image scanning with a spinning Nipkow disk, an arc light and photocells.

Scanning images with an arc light, a motorized spinning disk with a few holes punched into it and a photocell is quite low tech. However, nowadays it seems much simpler to quickly hook up a beamer to a computer and a photoresistor to an Arduino in order to achieve a similar result. Nevertheless, the results of early mechanical TV scanning were quite stunning and definitely outperformed my setup when it comes to speed: The mechanical image scanners of the 1920s were able to scan several images per second. They could actually transmit moving images – while my camera can only record still images of subjects which don’t move for 1½ minutes.

Background & thanks

I made this experiment during a workshop that I gave at HEAD – the art and design school of Geneva as part of the Master of Arts in Media Design program. The theme of the workshop was “Advanced Selfie Machines” and you can find more results of this workshop here.

Thanks to Alexia Mathieu and the whole team of HEAD for the nice time in Geneva and for inviting me to give this workshop! And thanks to Raphaelle Mueller for the additional photos!


About the Author

Niklas Roy is an installation artist and educator living in Berlin. Through his work, he explores art, science and technology, often in the form of humorous installations and machines. He likes to make as much as possible himself as it produces ideas which inspire his future projects. You can find out more about Niklas and his projects on his website and follow him on Twitter. This article was also published here and shared with permission.

from DIYPhotography.net -Hacking Photography, One Picture At A Time http://bit.ly/2NGCBpg
via IFTTT