Zhou co-directs two labs at the College and serves as a judge for HackDartmouth.
Xia Zhou is a computer science professor at the College specializing in mobile computing and visible light sensing. This March, she was awarded the 2019 Association for Computing Machinery SIGMOBILE RockStar Award for “outstanding early-career contributions and impact on [the] field.” In 2017, she added a Sloan Research Fellowship to her other accolades, which include having her work featured in a National Science Foundation-sponsored video. She co-directs both the Dartmouth Networks and Ubiquitous Systems Lab and the Dartmouth Reality and Robotics Lab at the College, and has taught several courses, including COSC 60, “Computer Networks,” and COSC 50, “Software Design & Implementation.” Last weekend, she was a judge at “HackDartmouth.”
How did you first get interested in computer science?
XZ: I had never thought of computer science as a career plan while growing up. I grew up in the southern region of China, and when I was young, I was actually more interested in literature and the arts. But in my second year of high school — when all Chinese students must decide on the broader area that they want to study for the future — there were two options: science and technology or literature and the humanities. My teacher recommended that I choose arts and literature, as the stereotypes back then were that girls were better at the arts and boys were better at science and technology. But my parents didn’t take my teacher’s recommendation — they wanted me to be financially independent in the future, and they thought that learning science and technology was how to get there. I chose computer science because of how difficult it was to get into — and because I had the scores to get into the program. So it was by accident that I came across computer science at a relatively late stage — I actually hadn’t used a computer before my undergraduate studies.
You recently won the RockStar Award for your work in visible light sensing. Can you tell me a little about the field?
XZ: So the idea is to utilize the ubiquitous lights around us, like ceiling lights, table lamps and even screen lights, as a sensing medium to sense how your body is interacting with the surrounding light rays. Then, we use that information to infer fine-grained behaviors, such as your pose, whether of your whole body or just a hand, or even behavior like pupil movement.
What are the applications of using visible light in this way?
XZ: So, one application is health monitoring. Through visible light sensing, you can enable continuous health monitoring in a manner that preserves privacy. Instead of cameras, you have a few light sensors, tiny and cheap, that are embedded in the environment. The sensors can detect how your body is interacting with or blocking visible light, and then we aggregate the information, which can be further analyzed to infer higher-level behavioral characteristics and allow study of whether there are correlations between those high-level behaviors and one’s mental state or even other health issues. Another type of application is human-computer interaction: turning light into an interaction interface by gesturing or moving around. You could potentially control devices in the environment without having sensors on the body or cameras around watching you all the time.
What first got you interested in studying light?
XZ: I did not study anything related to light during my Ph.D.; it was something I started doing after coming to Dartmouth. I came across the topic after I attended a conference in our field in 2013, where a workshop keynote speaker talked about the potential of using the visible light spectrum for communication. That was my first time hearing the concept, and I was super fascinated by the idea. I put together a National Science Foundation grant proposal for research in this field and then started this line of research from there.
Mobile computing is your other field of expertise. Can you tell me a little about your work in mobile computing?
XZ: Most of the research that I’m working on now is centered around light, but one of the challenges in mobile computing I’m working on is energy efficiency. As devices get smaller and more ubiquitous and also need continuous sensing, they impose very high energy-efficiency requirements. For example, most recently, we designed battery-free eye-trackers based on light sensing that are powered by solar cells attached to the side arms of the glasses, harvesting energy from indoor light.
Can you discuss the work you do in the two labs you co-direct?
XZ: DartNets was the lab that I co-founded when I first arrived here, and at the end of last year, we started another lab together with faculty in robotics and computational fabrication: RLab. There are quite a few interesting projects going on in that lab as well, often at the intersection of mobile computing, robotics and fabrication. For instance, we are collaborating with the robotics faculty to see how light sensing and communication can be applied to facilitate a robot’s communication and sensing of the environment. In particular, we are looking at underwater environments and seeing how light sensing and communication can help underwater robotics.
Is there any other research you’re currently undertaking?
XZ: Another direction I’m working on is with Devin Balkcom, a robotics faculty member, and David Kraemer, who is faculty in the education department, on a project about human motion. There are many tasks that involve physical motions that are often very hard to teach. Unlike math or physics, for which you can watch a video to see the experiment and grasp the concept, motion tasks like yoga or learning a sport are trickier to learn and harder to correct. Oftentimes, the teacher can only observe what you do and then give you some feedback afterward — it’s very hard to get precise, on-the-spot feedback while you are performing the motion. We are trying to build a holistic system that can continuously sense how your joints are moving over time, and in the meantime, we’re trying to leverage technologies like augmented and virtual reality to give you real-time feedback, so that you can see how you are performing the motion and correct it in real time.
You were a judge for HackDartmouth this past weekend. How was that experience?
XZ: I love HackDartmouth. I have been a judge for the event many, many times, almost as many times as there have been HackDartmouths. I’m always very happy to see all the projects that the students come up with, and I hope that it’s a useful experience for them to go through the whole process of coming up with the idea, executing the idea and presenting the idea. I’m happy to be part of that process to help them more.
Do you have any advice for students?
XZ: My advice is to not be intimidated by students who might appear more experienced or knowledgeable than you about a certain subject you are interested in. Don’t be afraid of starting something a little bit late — you can always catch up with drive and commitment. That was a lesson I personally learned: a lot of my classmates had used computers or even knew one or two programming languages already, when I had just learned to type that summer. The key thing is to learn how to turn that pressure into a force that drives you to work hard. You can always catch up.