
To Do, or Not To Do

(Image: a remote control)

Interacting with Computer Systems

The buttons you used on the brain are the beginning of a basic User Interface (UI). A UI is a space that allows the user to interact with a computer system (or machine). When you programmed the buttons on the brain, you gave users a way to interact with the Clawbot so they could raise and lower the arm. There are other types of UIs, including Graphical User Interfaces (GUIs) such as the touchscreens in cars and on smartphones. When you interact with the touchscreen on one of your devices (tablet, smartphone, smartwatch), that screen is often the only interface you have. Your device may also have volume or power buttons, but you mainly interact with the screen.
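The idea of programming buttons on the brain can be sketched in plain Python. This is an illustrative stand-in, not the actual VEX API; the `Brain`, `Arm`, and button names here are hypothetical:

```python
# Illustrative sketch: binding buttons to actions, the way a robot brain
# might let users raise and lower an arm. Not real VEX code.

class Arm:
    """A pretend robot arm that tracks its position."""
    def __init__(self):
        self.position = 0  # arbitrary units

    def raise_arm(self):
        self.position += 1

    def lower_arm(self):
        self.position -= 1

class Brain:
    """A tiny stand-in for a robot brain with programmable buttons."""
    def __init__(self):
        self.handlers = {}

    def on_press(self, button, handler):
        # Programming a button means binding it to an action.
        self.handlers[button] = handler

    def press(self, button):
        # Simulate the user pressing a button on the UI.
        self.handlers[button]()

arm = Arm()
brain = Brain()
brain.on_press("up", arm.raise_arm)
brain.on_press("down", arm.lower_arm)

brain.press("up")
brain.press("up")
brain.press("down")
print(arm.position)  # → 1
```

The important idea is the separation: the UI (the buttons) is just a way for the user to trigger actions the program has already defined.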

Some examples of UIs include the buttons on a video game controller or on a microwave. The buttons on a TV remote, for instance, are programmed to turn the TV off or to turn the volume up when they are pushed. The way these User Interfaces are designed depends on how the device works and how users interact with it.
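The TV remote can be sketched the same way: a lookup from each button to the action it performs. This is an illustrative Python sketch, not any real remote's firmware:

```python
# Illustrative sketch: a TV remote as a mapping from buttons to actions.
tv = {"power": False, "volume": 10}

def toggle_power():
    tv["power"] = not tv["power"]

def volume_up():
    tv["volume"] += 1

# The UI design decisions are which buttons exist and what each one does.
buttons = {"power": toggle_power, "vol+": volume_up}

buttons["power"]()  # user presses the power button
buttons["vol+"]()   # user presses volume up
print(tv)  # → {'power': True, 'volume': 11}
```

Changing the UI (adding a mute button, rearranging the layout) doesn't change how the TV works, only how the user reaches those actions.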

Those design principles form the foundation of the User Experience (UX) of a UI. The User Experience is how well the interface lets me, as the user, do what I'm trying to do. Is the interface working as I expect it to? Is it responsive to what I'm trying to communicate with my presses? Is it organized well, or should future versions of the UI rearrange the buttons to make it easier to use? What does the interface look like in general? Is it pleasing to look at, and does it make me want to use it more often?

When a UI is still being developed and undergoing iterations, the developers collect data on what works as planned and what needs to be fixed or enhanced. That data then informs the next round of iterative design. Some of the recommended UX changes happen before the device is released. But the device might also be sold as is, with those changes made later, before the next version is offered to the public.