Adam Conkey

Senior Software Engineer

Berkshire Grey

Biography

I received my Ph.D. in Computing and Robotics from the University of Utah under the supervision of Dr. Tucker Hermans. My dissertation investigated skill planning under state and goal uncertainty for robot manipulation tasks.

I have hands-on experience with a variety of robot platforms, sensors, and simulators. I am passionate about open source software development for robotics, particularly for the ROS framework and Linux.

I am currently a Senior Software Engineer on the Applications team at Berkshire Grey.

In my spare time, I am learning Rust. I enjoy reading, playing guitar, and doing crossword puzzles. When I’m not injured, you can find me out trail running and cycling.

Interests
  • Robotics
  • Artificial Intelligence
  • Deep Learning
  • Multisensory Perception
  • Reinforcement Learning
  • Embodied Cognition
Education
  • PhD Computing and Robotics, 2022

    University of Utah

  • MS Computer Science, 2016

    DePaul University

  • BS Mathematics and Philosophy, 2011

    Carnegie Mellon University

Experience

 
Berkshire Grey
Senior Software Engineer
Jan 2024 – Present Boston, MA
 
HRL Laboratories
Robotics Machine Learning Research Scientist
Aug 2022 – Jan 2024 Malibu, CA
I was technical lead and a core developer for a team creating new capabilities for undersea autonomous systems. I worked at the intersection of robotics, machine learning, perception, and autonomous decision-making to deliver autonomy capabilities to client platforms.
 
Utah Learning Lab for Manipulation Autonomy (LL4MA)
Research Assistant
Aug 2016 – Aug 2022 Salt Lake City, UT
I worked under the supervision of Dr. Tucker Hermans to complete research for my PhD dissertation regarding skill planning under state and goal uncertainty for robot manipulation tasks.
 
Amazon Robotics
Research Scientist Intern - Advanced Robotics
May 2019 – Aug 2019 North Reading, MA
I implemented an end-to-end robotic system for performing a material handling task using ROS and C++. I additionally developed a Python application from scratch for performing discrete event simulation, visualization, and timing analysis.
 
Accenture
Associate Software Engineer
Apr 2015 – Jul 2016 Austin, TX
I was an Agile Scrum developer in the Health and Public Service division of Accenture. I created new features for state healthcare systems, ranging from frontend interfaces to backend batch operations, using technologies such as Java, Oracle SQL, and Oracle Policy Automation.
 
United States Patent and Trademark Office (USPTO)
Patent Examiner - Computer Science
May 2012 – Nov 2013 Alexandria, VA
I was a patent examiner in computer science in the art of compilers and software development tools. I determined the patentability of claimed inventions in view of discovered prior art and issued official decisions on whether each patent would be granted.
 
Carnegie Mellon University
Student Researcher
Nov 2008 – May 2011 Pittsburgh, PA
I led an effort to improve the search efficiency of an automated theorem prover called AProS. I also explored syntactical abbreviations to make generated proofs more intelligible to humans, a key contribution to the project’s expansion into set theory and metamathematics.

Publications

Latent Space Planning for Multi-Object Manipulation with Environment-Aware Relational Classifiers
Transformer-based neural network that predicts object-environment relations for latent space planning of multi-object manipulation tasks.
Planning for Multi-Object Manipulation with Graph Neural Network Relational Classifiers
A graph neural network approach to multi-object manipulation from partial-view point clouds.
Planning under Uncertainty to Goal Distributions
A probabilistic framework for planning under state uncertainty to goals represented as distributions.
Representation Learning for Multisensory Perception and Planning
A high-level vision for learning robot behaviors in a lifelong learning setting leveraging multisensory observations.
Active Learning of Probabilistic Movement Primitives
An active learning approach to learning a library of Probabilistic Movement Primitives capable of generalizing over a bounded space.
Learning Task Constraints from Demonstration for Hybrid Force/Position Control
An approach to learning a dynamic constraint frame from demonstration for hybrid force/position control using Cartesian Dynamic Movement Primitives.

Projects

IsaacGym Data Collection
Motion-planned manipulation behaviors using MoveIt and IsaacGym to collect data for learning algorithms.
Region Selector for Pointcloud Segmentation
Tool for rviz that enables the user to select the region over which pointcloud segmentation will be performed.
Robot Control in Gazebo
Robot control framework for simulations in Gazebo. Designed to be extensible for easily adding new controllers. Works in conjunction with my robot interface package to decouple the controllers from any particular robot.
Robot Interface
Abstraction layer for controlling robots in a unified manner while accounting for robot platform-specific needs.
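The decoupling idea behind these two packages can be sketched as follows. This is a minimal illustration of the pattern only; the class and method names below are hypothetical placeholders, not the actual package API, and the real packages target ROS rather than plain Python.

```python
from abc import ABC, abstractmethod
from typing import List


class RobotInterface(ABC):
    """Platform-agnostic interface: controllers depend only on this,
    never on a specific robot platform (illustrative names)."""

    @abstractmethod
    def get_joint_positions(self) -> List[float]:
        """Return the current joint positions."""

    @abstractmethod
    def set_joint_commands(self, commands: List[float]) -> None:
        """Send position commands to the robot."""


class SimulatedArm(RobotInterface):
    """Hypothetical stand-in for a platform-specific implementation."""

    def __init__(self, num_joints: int = 7):
        self._positions = [0.0] * num_joints

    def get_joint_positions(self) -> List[float]:
        return list(self._positions)

    def set_joint_commands(self, commands: List[float]) -> None:
        self._positions = list(commands)


class ProportionalController:
    """Written purely against RobotInterface, so the same controller
    can drive any robot (simulated or real) that implements it."""

    def __init__(self, robot: RobotInterface, gain: float = 0.5):
        self.robot = robot
        self.gain = gain

    def step(self, target: List[float]) -> None:
        # Move each joint a fraction of the remaining error.
        current = self.robot.get_joint_positions()
        command = [q + self.gain * (t - q) for q, t in zip(current, target)]
        self.robot.set_joint_commands(command)


arm = SimulatedArm(num_joints=2)
controller = ProportionalController(arm)
for _ in range(20):
    controller.step([1.0, -1.0])  # arm converges toward the target
```

Because the controller only sees the interface, swapping the simulated arm for a real platform requires implementing the interface once, with no change to any controller.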
Real-Time Robot Control
Real-time Orocos controllers for the KUKA LBR4+ robot. Includes a controller switching framework for safely swapping controllers at runtime, and a simulated FRI component that allows Orocos components to be tested in Gazebo.
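The safe-switching invariant can be illustrated with a small sketch: at most one controller commands the robot at any time, because the outgoing controller is fully stopped before the incoming one starts. The names here are hypothetical placeholders, not the actual Orocos API, and the real framework enforces this under real-time constraints.

```python
class Controller:
    """Toy controller with an explicit start/stop lifecycle
    (illustrative names only)."""

    def __init__(self, name: str):
        self.name = name
        self.active = False

    def start(self) -> None:
        self.active = True

    def stop(self) -> None:
        self.active = False


class ControllerSwitcher:
    """Guarantees the outgoing controller is stopped before the
    incoming one is started, so commands never overlap."""

    def __init__(self) -> None:
        self.current = None

    def switch_to(self, controller: Controller) -> None:
        if self.current is not None:
            self.current.stop()   # stop the old controller first...
        controller.start()        # ...then start the replacement
        self.current = controller


switcher = ControllerSwitcher()
gravity_comp = Controller("gravity_compensation")
joint_pd = Controller("joint_pd")
switcher.switch_to(gravity_comp)
switcher.switch_to(joint_pd)  # gravity_comp is stopped before joint_pd starts
```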
ReFlex TakkTile in Gazebo Simulation
An actuated version of the ReFlex TakkTile hand and Gazebo plugins for simulating contact and pressure sensors on the fingers.
Simulator for Teleoperation
A lightweight simulator with force rendering created with rviz and DART. Provides a training ground for learning robust policies from demonstration with a haptic input device.
Baxter Demo Recorder
A demonstration recorder for the Baxter robot utilizing the button and display interfaces to make it easier to record demonstrations when operating Baxter in gravity compensation mode.
rviz Trajectory Visualization
Package for generating trajectory visualizations in rviz, including a dynamic real-time trace of the robot’s end-effector with a customizable appearance.