Seongkook Heo

I am an Assistant Professor in the Department of Computer Science at the University of Virginia. My research interests span various areas of human-computer interaction, with an emphasis on interaction techniques and technologies.

I was previously a postdoctoral researcher in the DGP lab at the University of Toronto, where I worked with Prof. Daniel Wigdor. I received my Ph.D. in Computer Science from KAIST in 2017, advised by Prof. Geehyuk Lee. I have also been a research intern at industry research labs, including Samsung Advanced Institute of Technology, Microsoft Research (with Dr. Ken Hinckley), and Autodesk Research (with Dr. Tovi Grossman).

seongkook@virginia.edu | Rice Hall 524 | CV | Google Scholar


Prospective students: I am always looking for enthusiastic students who are interested in human-computer interaction. If you would like to work with me, please email me your CV and a description of your research interests.

Research

My research can be grouped into three categories: 1) expanding interaction bandwidth between humans and computers, 2) enhancing human-human communication through computers, and 3) designing interaction techniques for wearable computers.

Expanding Interaction Bandwidth Between Humans and Computers

Our interaction with mobile computers is usually made by moving a finger on a flat surface, which reduces our intention to 2D position information, while the computer responds through a visual display. In the real world, our physical manipulation is rich and bidirectional: we skillfully make complex hand movements to express our intentions efficiently and effectively, and we feel the reaction through our sense of touch. My research develops new technologies and interaction techniques that enable computers to sense and generate previously unused properties of physical manipulation, both for input and output.

Rich Input Techniques
...
FDSense

FDSense estimates the stiffness of end effectors touching the surface by analyzing how touch area and force change together over time, measured on a commodity force-sensing touchpad. We demonstrate uses of stiffness-augmented touch interaction in expressive applications.

UIST '18
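
Below is a minimal sketch of the kind of estimate FDSense produces, assuming hypothetical touchpad samples and a simple linear force-vs-area model; it is an illustration, not the estimator from the paper.

```python
# Sketch only: approximate stiffness as the least-squares slope of force vs. contact
# area over a single press. The sample values and units below are hypothetical.
import numpy as np

def estimate_stiffness(forces_n, areas_mm2):
    """Fit force ~ k * area + b; a larger k suggests a stiffer end effector."""
    A = np.vstack([areas_mm2, np.ones_like(areas_mm2)]).T
    (k, _b), *_ = np.linalg.lstsq(A, forces_n, rcond=None)
    return k

forces = np.array([0.1, 0.5, 1.0, 1.5, 2.0])      # newtons, sampled over time
areas = np.array([20.0, 28.0, 36.0, 44.0, 52.0])  # mm^2, sampled at the same times
print(f"stiffness ~ {estimate_stiffness(forces, areas):.3f} N/mm^2")
```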
...
Pre-Touch Interaction

A touch in fact begins before the finger makes contact. We explored a set of interaction techniques that utilize grip and hover information before, during, and after contact.

CHI '16
...
MelodicTap

While hotkeys make our interaction more efficient, learning them can be time-consuming and can hurt performance during the transition. MelodicTap invokes hotkeys with sequences of multi-finger tapping gestures, helping people learn hotkeys without a drop in performance.

OzCHI '16
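
A toy illustration of the mapping idea, with made-up tap chords and commands (the actual gesture set and recognition pipeline are described in the paper):

```python
# Toy illustration: bind short sequences of multi-finger tap "chords" to commands.
# A chord is the set of fingers that land together; sequences and commands are made up.
from typing import Dict, FrozenSet, Optional, Tuple

Chord = FrozenSet[int]  # e.g., frozenset({1, 2}) = index and middle finger tapped together

HOTKEYS: Dict[Tuple[Chord, ...], str] = {
    (frozenset({1}), frozenset({1, 2})): "copy",
    (frozenset({1}), frozenset({2, 3})): "paste",
    (frozenset({1, 2, 3}),): "undo",
}

def recognize(sequence: Tuple[Chord, ...]) -> Optional[str]:
    """Return the command bound to a tap-chord sequence, if any."""
    return HOTKEYS.get(sequence)

print(recognize((frozenset({1}), frozenset({1, 2}))))  # -> "copy"
```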
...
Consecutive Distant Taps

We collected and analyzed touch gestures made on an iPad across various applications and found a currently unused input space: consecutive distant taps. We present two interaction techniques that use this complementary input space.

CHI '14
...
Multi-Touch Shear Force

While tangential force can add a complementary input vocabulary to a touch interface, sensing tangential forces at multiple touch points is challenging. We developed a method that estimates tangential forces from the subtle touch-point movements caused by fingertip deformation.

CHI '13
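
A minimal sketch of the estimation idea, assuming a made-up linear calibration between touch-point drift and shear force; the published method is more involved.

```python
# Sketch only: approximate the shear (tangential) force at each touch point from the
# small drift of its contact centroid caused by fingertip deformation, scaled by a
# hypothetical per-user calibration constant.
import numpy as np

CAL_N_PER_MM = 0.8  # hypothetical: newtons of shear per millimeter of centroid drift

def shear_forces(anchor_points_mm, current_points_mm, cal=CAL_N_PER_MM):
    """Return an (fx, fy) shear estimate in newtons for every touch point."""
    drift = np.asarray(current_points_mm) - np.asarray(anchor_points_mm)
    return drift * cal

# Two fingers: the first drifted 1 mm to the right, the second 0.5 mm downward
print(shear_forces([(10.0, 20.0), (60.0, 20.0)],
                   [(11.0, 20.0), (60.0, 20.5)]))
```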
...
LongPad

LongPad is a touchpad that uses arrays of infrared LEDs and photodiodes to measure the shape and distance of hand parts on or above the touch surface. With LongPad, we enable new hover-enriched touch gestures and reject more than 99% of accidental touches.

CHI '13
...
Force Gestures

Force Gestures are touch screen gestures augmented with the normal and tangential forces applied to the screen. We built a custom device to measure these multidimensional forces and designed a new gesture set around them.

UIST '11
...
ForceDrag

ForceDrag introduces a new way of using normal force as a modifier for a touch gesture. This work investigates modifiers with different numbers of levels and different modifier selection techniques.

OzCHI '12
...
ForceTap

Some modern mobile devices can sense force on their touch screens; however, this capability was not available when this work was conducted. We developed a sensor fusion technique that uses device acceleration to estimate the impact of a tapping gesture.

MobileHCI '11
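
A minimal sketch of the sensor fusion idea, with a made-up accelerometer trace and threshold (not the detector from the paper):

```python
# Sketch only: when a touch-down event arrives, inspect the accelerometer magnitude
# in a short window around it and use the peak jolt to label the tap as gentle or
# strong. The trace, window, and threshold below are hypothetical.
import numpy as np

STRONG_TAP_THRESHOLD = 2.5  # m/s^2 after removing gravity; hypothetical value

def classify_tap(accel_magnitudes, accel_times, touch_time, window_s=0.05):
    """Return 'strong' or 'gentle' based on the acceleration peak near touch-down."""
    accel_times = np.asarray(accel_times)
    near_touch = np.abs(accel_times - touch_time) <= window_s
    peak = np.max(np.asarray(accel_magnitudes)[near_touch])
    return "strong" if peak >= STRONG_TAP_THRESHOLD else "gentle"

times = np.arange(0.0, 1.0, 0.01)       # 100 Hz accelerometer timestamps
accel = 0.2 * np.ones_like(times)       # baseline hand tremor
accel[50] = 3.1                         # impact spike at t = 0.50 s
print(classify_tap(accel, times, touch_time=0.50))  # -> "strong"
```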

Haptic Feedback Technologies
...
Thor's Hammer

Thor's Hammer is a new handheld haptic device that uses propeller propulsion to generate 3-DOF force feedback without physical grounding or heavy air compressors. It enables strong and continuous force feedback in mobile AR/VR scenarios.

CHI '18
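
A minimal sketch of the force-allocation problem such a device has to solve, assuming a hypothetical layout of six propellers with fixed, axis-aligned thrust directions (this is not the device's actual geometry or controller):

```python
# Sketch only: find non-negative per-propeller thrusts whose sum approximates a
# desired 3-DOF force on the handle, for a hypothetical six-propeller layout.
import numpy as np
from scipy.optimize import nnls

# Columns are unit thrust directions of six hypothetical propellers (+/- x, y, z)
DIRECTIONS = np.array([
    [1.0, -1.0, 0.0,  0.0, 0.0,  0.0],
    [0.0,  0.0, 1.0, -1.0, 0.0,  0.0],
    [0.0,  0.0, 0.0,  0.0, 1.0, -1.0],
])

def allocate_thrusts(desired_force_n):
    """Return per-propeller thrusts (N, non-negative) approximating the desired force."""
    thrusts, _residual = nnls(DIRECTIONS, np.asarray(desired_force_n, dtype=float))
    return thrusts

print(allocate_thrusts([1.5, 0.0, -2.0]))  # e.g., pull forward and downward
```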
...
Compliance Illusion

We developed a haptic feedback method that creates the illusion of elastic movement on a rigid surface using a tangential force sensor and a vibrotactile actuator. The illusory movement can be programmatically controlled and can deliver information through touch.

IEEE Trans. Haptics, 2017

Enhancing Human-Human Communication Through Computers

These days, much of our communication with other people is mediated by computers. We conducted research to understand how people communicate with others in different types of relationships and how people communicate and engage through livestreams, and we developed new methods to infer relationship types and to support knowledge learning through livestreams.

...
StreamWiki

Knowledge-sharing livestreams offer learning experiences that are more engaging and interactive than traditional online learning methods. StreamWiki is a tool that allows viewers to collaboratively generate archival documentation of a stream.

CSCW '18
...
Livestreaming in China

Live streaming is making a huge impact in China, with more than 200 million viewers each night. This work shares the motivations and experiences of livestream viewers in China, as well as the insights we gained from the study.

CHI '18
...
Inferring Relationship Types

We developed a method that infers social relationship types among people in the same organization by analyzing their communication patterns. Our method collects co-location and instant messenger data and applies machine learning to classify relationship types.

CSCW '13
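
A toy sketch of the classification idea, with made-up features and labels (the actual feature set, data collection, and model are described in the paper):

```python
# Toy sketch: train a classifier on co-location and instant-messenger statistics to
# predict a relationship type. All features, labels, and values here are made up.
from sklearn.ensemble import RandomForestClassifier

# Each row: [hours co-located per week, IM messages per week, after-hours IM ratio]
X_train = [
    [20.0, 150, 0.05],  # teammate
    [18.0, 120, 0.10],  # teammate
    [ 2.0,  90, 0.60],  # friend
    [ 1.5, 110, 0.55],  # friend
    [ 5.0,  10, 0.02],  # manager
    [ 6.0,  15, 0.03],  # manager
]
y_train = ["teammate", "teammate", "friend", "friend", "manager", "manager"]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(clf.predict([[3.0, 100, 0.5]]))  # -> most likely "friend"
```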

Interaction Techniques for Wearable Computers

Wearable computers, such as smart glasses and smartwatches, allow people to access information in a variety of scenarios that were not possible before. However, these computers suffer from usability issues caused by their limited input area. We developed new interfaces and interaction techniques for wearable computers.

...
No-Handed Smartwatch Interaction

Smartwatches require both hands to operate, which makes them difficult to use when the hands are busy with another task. This work explores ways to interact with smartwatches without finger touch.

GI '17
...
Cross-device Text Entry

Entering text on smart glasses is challenging: keeping an arm raised to touch the device is tiring, and voice input is not always appropriate. We developed a cross-device method that lets users enter text for smart glasses on their smartwatch.

ISS '17
...
SplitBoard

SplitBoard is a new smartwatch keyboard that splits the QWERTY layout in half and lets users quickly switch between the splits with a flick gesture. Our study showed that the technique outperforms other techniques in both stationary and walking scenarios.

CHI '15 | IwC, 2016

Other Projects

...
IrPen

IrPen is a 6-DOF pen device tracked with IrCube technology, using a set of infrared LEDs on the pen and a set of photodiodes installed on the frame of the tablet device. We present several interaction techniques that utilize the position and orientation of the pen above the surface.

IEEE CG&A, 2014
...
IrCube

IrCube is an optical 6-DOF tracking method that is low-cost, small, and accurate. Photodiode sensors placed in the environment measure the light intensities emitted by 13 LEDs mounted on a pointer device in different orientations. From these measurements, the system estimates the position and orientation of the pointer by solving an inverse problem.

UIST '11 | Electronics Letters, 2011
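
A minimal sketch of the inverse-problem formulation, using a toy forward model (Lambertian-style LED directivity with inverse-square falloff) and a made-up sensor and LED layout; the actual model and solver are described in the paper.

```python
# Sketch only: recover a 6-DOF pose by fitting predicted photodiode intensities to
# measured ones with nonlinear least squares. Geometry and the light model are toy.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

SENSORS = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0],
                    [0.0, 0.3, 0.0], [0.3, 0.3, 0.0]])        # photodiode positions (m)
LED_DIRS_LOCAL = Rotation.random(13, random_state=0).apply([0.0, 0.0, 1.0])  # LED axes

def predict_intensities(pose):
    """pose = [x, y, z, rx, ry, rz]: pointer position (m) and rotation vector."""
    position, rot = pose[:3], Rotation.from_rotvec(pose[3:])
    led_dirs = rot.apply(LED_DIRS_LOCAL)                 # (13, 3) LED axes in world frame
    offsets = SENSORS - position                         # (4, 3) pointer-to-sensor vectors
    dists = np.linalg.norm(offsets, axis=1)              # (4,)
    cosines = led_dirs @ (offsets / dists[:, None]).T    # (13, 4) directivity terms
    return (np.clip(cosines, 0.0, None) / dists**2).ravel()

true_pose = np.array([0.15, 0.10, 0.40, 0.1, -0.2, 0.05])
measured = predict_intensities(true_pose)                # simulated, noise-free readings
fit = least_squares(lambda p: predict_intensities(p) - measured,
                    x0=np.array([0.1, 0.1, 0.3, 0.0, 0.0, 0.0]))
print(np.round(fit.x, 3))                                # should be close to true_pose
```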

Publications

Conference and Journal Papers

Seongkook Heo, Jaeyeon Lee, Daniel Wigdor
PseudoBend: Producing Haptic Illusions of Stretching, Bending, and Twisting Using Grain Vibrations
UIST 2019: ACM Symposium on User Interface Software and Technology.
Paper | Video
Devamardeep Hayatpur, Seongkook Heo, Haijun Xia, Wolfgang Stuerzlinger, Daniel Wigdor
Plane, Ray, and Point: Enabling Precise Spatial Manipulations with Shape Constraints
UIST 2019: ACM Symposium on User Interface Software and Technology.
Paper | Video
Sanghwa Hong, Eunseok Jung, Seongkook Heo, and Byungjoo Lee
FDSense: Estimating Young’s Modulus and Stiffness of End Effectors to Facilitate Kinetic Interaction on Touch Surfaces
UIST 2018: ACM Symposium on User Interface Software and Technology. 809-823
Paper | Video
Zhicong Lu, Seongkook Heo, and Daniel Wigdor
StreamWiki: Enabling Viewers of Knowledge Sharing Live Streams to Collaboratively Generate Archival Documentation for Effective In-Stream and Post-Hoc Learning.
CSCW 2018: ACM Conference on Computer-Supported Cooperative Work and Social Computing. 26 pages.
Paper
Seongkook Heo, Christina Chung, Geehyuk Lee, Daniel Wigdor
Thor's Hammer: An Ungrounded Force Feedback Device Utilizing Propeller-Induced Propulsive Force.
CHI 2018: ACM Conference on Human Factors in Computing Systems. 11 pages.
Paper | Video | GitHub
Zhicong Lu, Haijun Xia, Seongkook Heo, Daniel Wigdor
You Watch, You Give, and You Engage: A Study of Live Streaming Practices in China.
CHI 2018: ACM Conference on Human Factors in Computing Systems. 13 pages.
Paper | Video
Sunggeun Ahn, Seongkook Heo, Geehyuk Lee.
Typing on a Smartwatch for Smart Glasses.
ISS 2017: ACM International Conference on Interactive Surfaces and Spaces. 201-209.
Paper | Video
Seongkook Heo, Michelle Annett, Ben Lafreniere, Tovi Grossman, George Fitzmaurice.
No Need to Stop What You’re Doing: Exploring No-Handed Smartwatch Interaction.
GI 2017: Proceedings of Graphics Interface. 107-116.
Paper | Video
Seongkook Heo and Geehyuk Lee.
Vibrotactile Compliance Feedback for Tangential Force Interaction.
IEEE Transactions on Haptics, Vol. 10, Issue 3, 444-455. (2017)
Paper
Seongkook Heo, Jingun Jung, and Geehyuk Lee.
MelodicTap: Fingering Hotkey for Touch Tablets.
OzCHI ’16: Proceedings of Australian Conference on Human-Computer Interaction. 396-400.
Paper
Ken Hinckley, Seongkook Heo, Christian Holz, Hrvoje Benko, Abigail Sellen, Richard Banks, Kenton O'Hara, Gavin Smyth, and William Buxton.
Pre-Touch Sensing for Mobile Interaction.
CHI 2016: ACM Conference on Human Factors in Computing Systems. 2869-2881.
Paper | Video
Jonggi Hong, Seongkook Heo, Poika Isokoski, and Geehyuk Lee.
Comparison of Three QWERTY Keyboards for a Smartwatch.
Interacting with Computers, Vol. 28, Issue 6, 811-825. (2016)
Paper
Chang-Min Kim, Seongkook Heo, Kyeong Ah Jeong, and Youn-Kyung Lim.
Formula One: Mobile Device Supported Rapid In-the-Wild Design and Evaluation of Interactive Prototypes.
HCIK 2016: Proceedings of HCI Korea. 333-338.
Paper
Jonggi Hong, Seongkook Heo, Poika Isokoski, and Geehyuk Lee.
SplitBoard: A Simple Split Soft Keyboard for Wristwatch-sized Touch Screens.
CHI 2015: ACM Conference on Human Factors in Computing Systems. 1233-1236.
Paper | Video
Jaehyun Han, Seongkook Heo, Hyong-Euk Lee, and Geehyuk Lee.
IrPen: A 6-DOF Pen System to Support Over-the-surface Interactions with Tablet Computers.
IEEE Computer Graphics and Applications, Vol. 34, Issue 3, 22-29. (2014)
Paper | Video
Seongkook Heo, Jiseong Gu, and Geehyuk Lee.
Expanding Touch Input Vocabulary by Using Consecutive Distant Taps.
CHI 2014: ACM Conference on Human Factors in Computing Systems. 2597-2606.
Paper | Video
Seongkook Heo, Jaehyun Han, and Geehyuk Lee.
Designing Rich Touch Interaction through Proximity and 2.5D Force Sensing Touchpad.
OzCHI ’13: Proceedings of Australian Conference on Human-Computer Interaction. 401-404.
Paper
Seongkook Heo and Geehyuk Lee.
Indirect Shear Force Estimation for Multi-Point Shear Force Operations.
CHI 2013: ACM Conference on Human Factors in Computing Systems. 281-284.
Paper | Video
Jiseong Gu, Seongkook Heo, Jaehyun Han, Sunjun Kim, and Geehyuk Lee.
LongPad: A TouchPad Using the Whole Area below the Keyboard on a Laptop.
CHI 2013: ACM Conference on Human Factors in Computing Systems. 1421-1430.
Paper | Video
Jinhyuk Choi, Seongkook Heo, Jaehyun Han, Geehyuk Lee, and Junehwa Song.
Mining Social Relationship Types in an Organization by using Communication Patterns.
CSCW 2013: ACM Conference on Computer-Supported Cooperative Work. 295-302.
Paper
Jaehyun Han, Sangwon Choi, Seongkook Heo, and Geehyuk Lee.
Optical touch sensing based on internal scattering in a touch surface.
Electronics Letters, Vol. 48, Issue 22. (2012)
Paper
Seongkook Heo and Geehyuk Lee.
ForceDrag: Using Pressure as a Touch Input Modifier.
OzCHI '12: Proceedings of Australian Conference on Human-Computer Interaction. 204-207.
Paper | Video
Seongkook Heo, Jaehyun Han, Sangwon Choi, Seunghwan Lee, Geehyuk Lee, Hyong-Euk Lee, SangHyun Kim, Won-Chul Bang, DoKyoon Kim, and ChangYeong Kim.
IrCube tracker: an optical 6-DOF tracker based on LED directivity.
UIST 2011: ACM Symposium on User Interface Software and Technology. 577-586.
Paper | Video
Jaehyun Han, Seongkook Heo, Geehyuk Lee, Won-Chul Bang, DoKyoon Kim, and ChangYeong Kim.
6-DOF tracker using LED directivity.
Electronics Letters, Vol. 47, Issue 3. (2011)
Paper
Seongkook Heo and Geehyuk Lee.
Force gestures: augmenting touch screen gestures with normal and tangential forces.
UIST 2011: ACM Symposium on User Interface Software and Technology. 621-626.
Paper | Video
Seongkook Heo and Geehyuk Lee.
Forcetap: extending the input vocabulary of mobile touch screens by adding tap gestures.
MobileHCI 2011: ACM SIGCHI International Conference on Human Computer Interaction with Mobile Devices and Services. 113-122.
Paper | Video

Please see my CV for a full list of publications and patents.
Last update: Feb 2019