Adam Conkey

PhD Candidate

University of Utah

LL4MA Lab

About Me

I am a PhD candidate in the School of Computing at the University of Utah, on the robotics track of the Utah Robotics Center. I work in the Utah Learning Lab for Manipulation Autonomy (LL4MA) and am advised by Dr. Tucker Hermans. My current research focuses on applying deep learning to problems in multisensory perception to better understand how sensory observations correspond to task-level events. I am also interested in investigating new methods for integrating perception and action for robotic agents.

I have hands-on experience with a variety of robot platforms, sensors, and simulators. I am passionate about open-source software development for robotics, particularly for the ROS framework and Linux. I expect to graduate in Spring 2022; please have a look at my resume and feel free to contact me if you would like to discuss future job opportunities.

In my free time I enjoy running in the mountains, cycling, cooking, reading, and playing guitar.

Interests
  • Robotics
  • Artificial Intelligence
  • Deep Learning
  • Multisensory Perception
  • Reinforcement Learning
  • Embodied Cognition
Education
  • PhD Computing and Robotics, expected 2022

    University of Utah

  • MS Computer Science, 2016

    DePaul University

  • BS Mathematics and Philosophy, 2011

    Carnegie Mellon University

News

December 2020: Passed my PhD proposal defense.
September 2020: Passed my PhD qualifier exam with a high-pass grade.
July 2020: Participated in the RSS Pioneers 2020 workshop.
August 2019: Our papers on active learning of ProMPs and learning DMP task constraints were accepted to Humanoids 2019.
Summer 2019: I will be interning at Amazon Robotics in North Reading, MA!

Experience

Research Scientist Intern - Advanced Robotics
Amazon Robotics
May 2019 – Aug 2019 · North Reading, MA
I implemented an end-to-end robotic system for performing a material handling task using ROS and C++. I additionally developed a Python application from scratch for performing discrete event simulation, visualization, and timing analysis.
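The core of a discrete event simulation like the one described above is a time-ordered event queue: pop the next event, run its handler, and let handlers schedule follow-up events. A minimal sketch of that pattern follows; it is not Amazon's code, and all names are hypothetical.

```python
import heapq

class EventQueue:
    """Minimal discrete-event simulation loop: pop events in time order,
    letting each handler schedule follow-up events."""

    def __init__(self):
        self._events = []   # heap of (time, seq, handler) tuples
        self._seq = 0       # tie-breaker so equal-time events keep FIFO order

    def schedule(self, time, handler):
        heapq.heappush(self._events, (time, self._seq, handler))
        self._seq += 1

    def run(self, until):
        while self._events and self._events[0][0] <= until:
            time, _, handler = heapq.heappop(self._events)
            handler(time, self)  # handler may call schedule() for new events

# Example: a station that processes an item every 2.0 time units.
def process_item(now, queue):
    print(f"t={now:.1f}: item processed")
    queue.schedule(now + 2.0, process_item)

queue = EventQueue()
queue.schedule(0.0, process_item)
queue.run(until=10.0)
```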
Research Assistant
Utah Learning Lab for Manipulation Autonomy (LL4MA)
Aug 2016 – Present · Salt Lake City, UT
Associate Software Engineer
Accenture
Apr 2015 – Jul 2016 · Austin, TX
I was an Agile Scrum developer in the Health and Public Service division. I created new features for state healthcare systems, ranging from frontend interfaces to backend batch operations, using technologies such as Java, Oracle SQL, and Oracle Policy Automation.
Patent Examiner - Computer Science
United States Patent and Trademark Office
May 2012 – Nov 2013 · Alexandria, VA
I examined patent applications in the art of compilers and software development tools. I determined the patentability of claimed inventions in view of discovered references and issued official decisions on whether a patent should be granted.

Projects

IsaacGym Data Collection
Motion-planned manipulation behaviors using MoveIt and IsaacGym to collect data for learning algorithms.
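At its core, this kind of data collection is a plan-execute-record loop. The rough sketch below illustrates that loop only; `env` and `plan_to_pose` are hypothetical stand-ins, not the actual IsaacGym or MoveIt APIs.

```python
import numpy as np

def collect_episode(env, plan_to_pose, target_pose, path="episode.npz"):
    """Plan a motion to a target pose, execute it, and record
    (observation, action) pairs for downstream learning.

    `env` and `plan_to_pose` are hypothetical stand-ins for the
    simulation environment and motion-planning call.
    """
    trajectory = plan_to_pose(target_pose)   # list of joint configurations
    observations, actions = [], []
    for joint_target in trajectory:
        observations.append(env.observe())   # e.g. joint states, camera image
        actions.append(joint_target)
        env.step(joint_target)               # advance the simulation
    np.savez(path, observations=np.asarray(observations),
             actions=np.asarray(actions))
```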
Region Selector for Pointcloud Segmentation
Tool for rviz that enables the user to select the region over which pointcloud segmentation will be performed.
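The selection ultimately reduces to cropping the cloud to a user-drawn box. A minimal numpy sketch of that cropping step (the rviz selection plumbing itself is omitted):

```python
import numpy as np

def crop_to_region(points, region_min, region_max):
    """Keep only the points inside an axis-aligned box.

    points:     (N, 3) array of XYZ coordinates
    region_min: (3,) lower corner of the user-selected region
    region_max: (3,) upper corner of the user-selected region
    """
    mask = np.all((points >= region_min) & (points <= region_max), axis=1)
    return points[mask]

# Example: keep points within a 1 m box in front of the sensor.
cloud = np.random.uniform(-2.0, 2.0, size=(10000, 3))
cropped = crop_to_region(cloud, np.array([0.0, -0.5, -0.5]),
                         np.array([1.0, 0.5, 0.5]))
```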
Robot Control in Gazebo
Robot control framework for simulations in Gazebo. Designed to be extensible for easily adding new controllers. Works in conjunction with my robot interface package to decouple the controllers from any particular robot.
Robot Interface
Abstraction layer for controlling robots in a unified manner while accounting for robot platform-specific needs.
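The previous two entries describe a deliberate separation of concerns: controllers are written against an abstract robot interface rather than a specific platform. A minimal sketch of that decoupling, with hypothetical class names:

```python
from abc import ABC, abstractmethod

class RobotInterface(ABC):
    """Hypothetical abstraction layer: each robot platform implements
    these primitives, so controllers never touch platform specifics."""

    @abstractmethod
    def get_joint_positions(self):
        ...

    @abstractmethod
    def send_joint_command(self, command):
        ...

class ProportionalJointController:
    """Written purely against RobotInterface, so it runs unchanged on
    any robot (real or simulated) that implements the interface."""

    def __init__(self, robot, gain=10.0):
        self.robot = robot
        self.gain = gain

    def update(self, desired_positions):
        current = self.robot.get_joint_positions()
        command = [self.gain * (d - q)
                   for d, q in zip(desired_positions, current)]
        self.robot.send_joint_command(command)
```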
Real-Time Robot Control
Real-time Orocos controllers for the KUKA LBR4+ robot. Includes a controller-switching framework for safely swapping controllers at runtime, and a simulated FRI (Fast Research Interface) component that allows Orocos components to be tested in Gazebo.
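Safe swapping chiefly means guaranteeing that the outgoing controller has fully stopped before the incoming one starts commanding the robot. The real framework is Orocos C++; the sketch below only illustrates the hand-off logic, with hypothetical names:

```python
import threading

class ControllerSwitcher:
    """Hypothetical sketch of safe controller hand-off: the lock ensures
    the old controller is fully stopped before the new one starts, so two
    controllers never command the robot simultaneously."""

    def __init__(self):
        self._active = None
        self._lock = threading.Lock()

    def switch_to(self, controller):
        with self._lock:
            if self._active is not None:
                self._active.stop()    # e.g. ramp commands to zero first
            controller.start()         # e.g. initialize targets to current state
            self._active = controller
```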
ReFlex TakkTile in Gazebo Simulation
An actuated version of the ReFlex TakkTile hand and Gazebo plugins for simulating contact and pressure sensors on the fingers.
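Conceptually, a simulated pressure sensor aggregates the physics engine's contact forces over a taxel's patch and divides by its area. The actual plugins are Gazebo C++ plugins; the toy illustration below only shows that mapping, and the area value is made up.

```python
def taxel_pressure(contact_forces, taxel_area=1e-4):
    """Map simulated normal contact forces (N) on one taxel patch to a
    pressure reading (Pa). Illustrative stand-in for what a Gazebo
    contact-sensor plugin computes; taxel_area is an example value."""
    total_normal_force = sum(contact_forces)
    return total_normal_force / taxel_area

# Example: two contacts on a 1 cm^2 taxel.
print(taxel_pressure([0.3, 0.1]))  # ~4000 Pa
```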
Simulator for Teleoperation
A lightweight simulator with force rendering, built with rviz and DART. Provides a training ground for learning robust policies from demonstrations given with a haptic input device.
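Force rendering of this kind is commonly implemented as a virtual coupling, i.e. a spring-damper between the haptic device and its proxy in the simulation. A minimal sketch assuming that scheme (the gains are illustrative, not the project's tuned values):

```python
import numpy as np

def virtual_coupling_force(proxy_pos, device_pos, device_vel,
                           stiffness=200.0, damping=5.0):
    """Spring-damper force rendered to the haptic device, pulling it
    toward the simulated proxy."""
    return stiffness * (proxy_pos - device_pos) - damping * device_vel

# Example: device displaced 1 cm from the proxy, moving at 0.1 m/s.
f = virtual_coupling_force(np.zeros(3),
                           np.array([0.01, 0.0, 0.0]),
                           np.array([0.1, 0.0, 0.0]))
print(f)  # ~[-2.5, 0, 0]
```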
Baxter Demo Recorder
A demonstration recorder for the Baxter robot utilizing the button and display interfaces to make it easier to record demonstrations when operating Baxter in gravity compensation mode.
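Recording a demonstration in gravity-compensation mode amounts to logging joint states while the user guides the arm, with recording toggled by the cuff buttons. A minimal rospy sketch follows; the topic name follows the standard Baxter convention, but this is not the package's actual code.

```python
import rospy
from sensor_msgs.msg import JointState

class DemoRecorder:
    """Logs (timestamp, joint positions) while a demonstration runs."""

    def __init__(self, topic="/robot/joint_states"):
        self.samples = []
        self.recording = False
        rospy.Subscriber(topic, JointState, self._callback)

    def _callback(self, msg):
        if self.recording:  # e.g. toggled by Baxter's cuff buttons
            self.samples.append((msg.header.stamp.to_sec(),
                                 list(msg.position)))

    def start(self):
        self.samples, self.recording = [], True

    def stop(self):
        self.recording = False
        return self.samples

# Usage:
# rospy.init_node("demo_recorder")
# recorder = DemoRecorder()
# recorder.start()  ...guide the arm...  samples = recorder.stop()
```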
rviz Trajectory Visualization
Package for generating trajectory visualizations in rviz, including dynamic real-time visualizations showing a customizable trace of the robot’s end-effector.
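A trace like this is typically drawn by publishing a visualization_msgs/Marker of type LINE_STRIP containing the recorded end-effector positions. A minimal rospy sketch, where the frame and topic names are assumptions:

```python
import rospy
from geometry_msgs.msg import Point
from visualization_msgs.msg import Marker

def make_trace_marker(positions, frame_id="world"):
    """Build a LINE_STRIP marker tracing the end-effector path.
    positions: list of (x, y, z) tuples; frame_id is an assumption."""
    marker = Marker()
    marker.header.frame_id = frame_id
    marker.header.stamp = rospy.Time.now()
    marker.type = Marker.LINE_STRIP
    marker.action = Marker.ADD
    marker.scale.x = 0.005                  # line width in meters
    marker.color.g = marker.color.a = 1.0   # opaque green
    marker.pose.orientation.w = 1.0
    marker.points = [Point(x, y, z) for (x, y, z) in positions]
    return marker

# Re-published on each update for a live, dynamic trace:
# pub = rospy.Publisher("ee_trace", Marker, queue_size=1)
# pub.publish(make_trace_marker(recorded_positions))
```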