Academics, Publications & Research

Students and Faculty Member Publish Computer Science Work

Assistant Professor of Computer Science Janyl Jumadinova, Hanzhong Zheng ’17, and Almog Boanos ’17 have published their work “OWLS: Observational Wireless Life-enhancing System” in the Proceedings of the International Conference on Autonomous Agents and Multiagent Systems as a short paper and in the Proceedings of the Autonomous Robots and Multirobot Systems Workshop as a long paper. In this work, they developed a system that integrates multiple wearable sensors, software agents, robots, and health analysis technology into a single personal therapy solution, and demonstrated its effectiveness and efficiency.

Dr. Jumadinova and Hanzhong Zheng also published an article titled “Using Boolean Networks for Consensus in Multi-Robot Environmental Monitoring Tasks” in the Proceedings of the IEEE International Conference on Electro/Information Technology. In this paper, they presented a novel approach to multi-robot environmental monitoring based on dynamical systems, in which a robotic team overcomes data misinterpretation and aggregation difficulties through collaboration among all members of the team.

Source: Academics, Publications & Research

Hanzhong Zheng ’17 and Professor Jumadinova Publish Work in Artificial Intelligence Symposium

Hanzhong Zheng ’17 and Assistant Professor of Computer Science Janyl Jumadinova published a research article in the Association for the Advancement of Artificial Intelligence (AAAI) Symposium titled “Monitoring the Well-Being of a Person Using a Robotic-Sensor Framework.” The paper describes an integrated intelligent system, consisting of multiple mobile robots and wearable sensors, that is able to monitor and report on the health and the general well-being of an individual. The work was conducted during an independent study and student/faculty summer collaborative research funded by the provost’s office. This spring Hanzhong and Dr. Jumadinova will present their work at the AAAI Symposium, Well-Being Computing: AI Meets Health and Happiness Science.

Wenskovitch Publishes in Journal of Nursing Education and Practice

Visiting Assistant Professor of Computer Science John Wenskovitch, Debra Wolf from Chatham University, and Bonnie Anton from UPMC St. Margaret published an article — titled “Nurses’ Use of the Internet and Social Media: Does Age, Years of Experience, and Educational Level Make a Difference?” — in the peer-reviewed Journal of Nursing Education and Practice. The article summarizes a statistical analysis of a survey sent to nursing groups nationwide, looking at the safe and appropriate use of technology by nursing professionals.

Faculty Serve As Artists in Residence at Research Station in the High Arctic

Assistant Professor of Art Byron Rich, Visiting Assistant Professor of Art Heather Brand and Visiting Assistant Professor of Computer Science John Wenskovitch were selected as artists in residence at the prestigious Ars Bioarctica Residency, part of the University of Helsinki, at its biological research station in the High Arctic of Finland. They spent two weeks exploring the ecology of this arctic region and producing work in response to their observations. In addition, Professors Rich, Brand and Wenskovitch presented their research at the 2015 meeting of the International Symposium on Electronic Art in Vancouver, Canada, in August 2015, where they spoke about their collaborative work IMMOR(t)AL.

Looking for a Health Coach? Try a Robot

Imagine a world where robots help those with special health needs continue living independently at home.

That’s what Assistant Professor of Computer Science Janyl Jumadinova and three of her research students are striving to do.

We’re not talking about robots like Rosie from “The Jetsons.” What Jumadinova and Allegheny students Almog Boanos ’17, Michael Camara ’17 and Victor Zheng ’17 are doing is creating a monitoring system consisting of multiple robots, wearable sensors and software that can provide personalized monitoring of a user’s well-being in his or her own home.

For example, if a person is at risk for falling or for having a stroke, the robots can be trained to follow this person and monitor certain parts of his or her condition, such as temperature, speed, location and blood pressure. If there is a sudden change in the data – signaling a life-threatening situation – the robots can send an emergency message to a caregiver’s computer or cell phone, or to a doctor’s office. They also can send a message to 911, if needed.

Caregivers or physicians outside the home also can have access to the health data, allowing for continuous monitoring.

“With the growing special-needs and aging baby-boomer population, paired with a deficit in caregivers, there is an increasing need for personalized care,” Jumadinova says. “I have always had an interest in developing life-enhancing technologies, so that’s where this idea originated.”

The system requires the user to wear a small sensor that monitors the person’s vital signs, the researchers explain.

“My job for this project was to make sure the information coming from the sensor was transmitting to a database, which then analyzes the different health conditions,” says Boanos, who is double majoring in neuroscience and computer science. “If anything changes rapidly, the robot can sense the change and create an event, like calling emergency personnel. The GPS device can even give the person’s location, so it can send an ambulance if needed.”
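The rapid-change check Boanos describes can be sketched in a few lines. This is only an illustration of the idea, not the project’s actual code; the function name, window size and threshold here are all hypothetical:

```python
# Hypothetical sketch of the rapid-change check: compare each new
# vital-sign reading against the average of recent readings and flag
# an alert event when it jumps by more than a set threshold.

from collections import deque

def detect_rapid_change(readings, window=5, threshold=20.0):
    """Return indices of readings that jump sharply from the recent average."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(value - baseline) > threshold:
                alerts.append(i)  # e.g. trigger a call to emergency personnel
        recent.append(value)
    return alerts

# Example: a sudden systolic blood-pressure spike in the sixth reading
bp = [120, 122, 119, 121, 120, 165]
print(detect_rapid_change(bp))  # -> [5]
```

In the real system, the flagged event would carry the GPS location along with it so that responders can be directed to the person.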

According to Jumadinova, the sensor can communicate wirelessly with the robot, which looks similar to a Roomba vacuum cleaner with a laptop on it. The base, called a Turtlebot, is on wheels so it can move at different speeds.

Also part of the unit are Kinect sensors, which are the same sensors used in the Xbox gaming system. These sensors allow the computer to “see” a picture of a human. They also allow the robot to detect the distance between itself and an object in front of it.

“The laptop is basically the brain of the robot, and the Kinect sensors are the camera,” says Zheng, who is majoring in computer science with minors in math and economics. “My job for this project was writing algorithms to establish a connection between the laptop and the robot.”

The third student, Camara, worked behind the scenes this summer to develop what’s called a “text mining system.”

“The robots can collect data and analyze it to find long-term trends. The data is then saved in a database, which then can be processed by the text mining system,” Jumadinova says. “The idea is that the robots may not see long-term trends, but the text mining system can go through the long-term data and find any alarming trends, and then notify the robots, if needed, by sending them a message.”
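The long-term check Jumadinova describes amounts to looking for slow drift across the whole stored history rather than a sudden jump. As a rough, hypothetical sketch (the slope threshold and function names are invented for illustration, not taken from the project):

```python
# Hypothetical sketch of a long-term trend check over stored readings:
# fit a least-squares line to the full history and flag a slow drift
# that a short-horizon check on the robots would miss.

def trend_slope(values):
    """Least-squares slope of values over their indices."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def alarming_trend(history, max_slope=0.5):
    """True if the history drifts upward faster than the allowed slope."""
    return trend_slope(history) > max_slope

# Daily resting heart rates creeping upward over two weeks
history = [62, 63, 62, 64, 65, 64, 66, 67, 66, 68, 69, 70, 71, 72]
print(alarming_trend(history))  # -> True
```

If the flag comes up true, the system would then notify the robots by message, as Jumadinova describes.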

Allegheny currently has four robots. When the robots are turned on, Jumadinova says, they must first travel around the room to create a map of it. A robot then uses the map to find the person it is tracking.

“Only one robot will follow a person at a time. And if it needs to charge itself, it can go directly to its docking station and call another robot to its location,” she says.

“The greatest challenge has been getting the robots to talk to each other,” Zheng adds. “But they now can communicate and tell each other to ‘come here’ if needed.”

A Long Way From Home

While growing up in Israel, Almog Boanos ’17 always knew he wanted to do something with computers. As he grew older, he also became interested in neuroscience.

As he started to research colleges, he couldn’t find one in Israel that would allow him to pursue both passions. That’s when he found Allegheny.

“The only neuroscience program available in Israel was for Ph.D. students. Then I found Allegheny, which would allow me to double major in both,” he says.

Boanos would like someday to use small computers to simulate different neurons and see how different chemical changes affect brain activity.

“Eventually, I’d like to work with the Blue Brain Project, which is an attempt to reverse engineer the human brain and re-create it at the cellular level inside a computer simulation,” he says. “I hope my background here can help me get there.”

The opportunity to do this type of hands-on research as an undergraduate is surprising, says Boanos. “We are given a lot of independence. But if you have any questions, the professors are always there. It’s really amazing to be working on something like this as a junior. I can see the power of computer science through this project.”

Applying what he has learned in his computer science classes to this research is enjoyable. “To me, computer science is about using all the information you learn in class in a really creative way. And learning how to program gives you the ability to use your imagination to create whatever you want; you can create amazing things. I think this project is impressive – it will really affect people’s lives,” says Boanos.

What’s next for the project? Jumadinova says they will continue testing and refining the system. But she doesn’t plan to stop there.

“In addition to monitoring a person, we hope that our team of robots will be able to provide motivation for cognitive and physical exercises to the user by considering the history of the user’s daily tasks and coaching the person to fulfill appropriate tasks, such as taking medicine, exercising or being socially active,” she says. “I also hope to meet with those in the medical community to get a better understanding of various health conditions so we can tailor the robots to those conditions.

“So far, I haven’t seen any other systems out there using data from wearable sensors with robots in this continuous way,” she adds. “It’s exciting.”

Just Learning in the Sand

Allegheny College’s newest piece of technology offers students a chance to roll up their sleeves and act like a kid again — a combination of sand and smarts. This augmented reality sandbox, located in the basement of Alden Hall, arrived in late January and creates three-dimensional topographical maps based on the way students physically shape the sand.

Tyler Pecyna is the fact-checker for Pittsburgh Magazine. This article appeared in Pittsburgh Magazine’s Great Minds newsletter.

Students Get Their Hands Dirty With New Augmented Reality Sandbox

Allegheny senior Kristy Garcia rolled up her sleeves and dug right into the sandbox, piling up clean, white sand to form a mountain.

Senior David Olson joined in as well, using his fingers to dig a trench at the base of the mountain.

As they watched the colors change from deep reds and oranges to bright greens to blues, they braced themselves for the fun part – placing their hand over the camera overlooking the sandbox to “make it rain.”

“That is so cool!” the wide-eyed environmental science majors said in unison as virtual rain washed over the mountain and sloshed into the trench.

It’s a common reaction when someone first sees Allegheny’s newest piece of technology, the augmented reality (AR) sandbox, in the basement of Alden Hall.

The AR sandbox, which arrived at Allegheny in January, combines the playfulness of a child’s sandbox with advanced technology to create a learning tool that can be used by students of all ages. When students shape the sand, a Microsoft Kinect 3-D camera and a projector with powerful software detect the movement and display a three-dimensional topographic and colored elevation map in real time.

According to Sam Reese, lab technician for the geology and environmental science departments, unlike street maps, topographic maps display 3-D characteristics of an area using lines, called contours, to represent elevation above or below sea level. Using topographic maps, engineers know where best to build a road, scientists know where rainwater will flow after a storm and hikers know where a trail is steepest.
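The sandbox’s central trick is binning each depth-camera reading into an elevation band and projecting a color for it. A minimal sketch of that idea (the band boundaries and colors here are hypothetical, not the actual KeckCAVES software):

```python
# Hypothetical sketch of the sandbox's elevation shading: bin each
# depth-camera reading into a contour band, roughly as the projector
# colors the sand from blue (low, "water") up through green to red.

BANDS = [  # (upper elevation bound in cm above the table, color)
    (0, "blue"),     # below "sea level": water
    (5, "green"),    # lowlands
    (10, "orange"),  # hills
    (15, "red"),     # mountains
]

def color_for(elevation_cm):
    """Return the display color for one sand-surface elevation reading."""
    for bound, color in BANDS:
        if elevation_cm <= bound:
            return color
    return "white"  # snow-capped peak above the highest band

# A ridge profile measured across the sandbox, rising and falling again
profile = [-2, 3, 8, 14, 18, 14, 8, 3, -2]
print([color_for(e) for e in profile])
# -> ['blue', 'green', 'orange', 'red', 'white', 'red', 'orange', 'green', 'blue']
```

The real system does this per pixel in real time, and additionally draws contour lines where readings cross a band boundary.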

“By using this technology, students can actually see how a topographic map portrays a 3-D world. Sometimes people don’t grasp that concept on a flat 2-D map,” Reese says. “The beauty of the sandbox is the simplicity of the model, as it tells a very complicated story.”

Reese explains that the College acquired the materials to construct the sandbox through a grant from the Pennsylvania Department of Environmental Protection. Allegheny carpenters built the actual box, and Craig Newell Welding in Cambridge Springs, Pa., built the metal apparatus that holds the camera and software in place. Dave Wagner, network and systems administrator in computer science and information technology services, set up the operating system and installed the software.

The idea for the AR sandbox came from a group of Czech researchers who posted a YouTube video displaying an early prototype that included elevation maps and a basic form of fluid movement, Reese says. A team at the W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCaves) at the University of California Davis then added the topographic contour lines and improved the simulated fluid flow to create the current prototype. UC Davis provides the blueprints to build the system as well as the necessary software free of charge on its website.

Reese estimates that only a couple dozen AR sandboxes exist, mainly at museums. “It’s so new. The day our sandbox went live – Jan. 21 – an article appeared in the New York Times about augmented reality,” he says. “It’s really cutting edge for Allegheny to have this.”

Allegheny senior Kristy Garcia digs in the AR sandbox.

In addition to the geology and environmental science departments using the sandbox in labs and for independent research projects, the computer science and biology departments also plan to incorporate the technology into their class curricula.

College students won’t be the only ones digging in the sand. Creek Connections, a partnership between the College and K-12 schools that focuses on hands-on watershed education, plans to incorporate the AR sandbox in activities that explore topographic maps, watersheds and stream geology.

“People are used to street maps and Google maps that are very flat. But when we talk about watershed delineation and where rain will go, the concept becomes much easier when you can use a 3-D topographic map like this,” says Wendy Kedzierski, director of Creek Connections. “With the sandbox, you can see it as the sand builds up and the colors change. It makes the connection so much easier.”

Student Kristy Garcia, who works as a project assistant with Kedzierski and the Creek Connections program, agrees. “It’s definitely easier to understand topography when looking at the sandbox,” she says.

Kedzierski believes another benefit is that the sandbox will give students who prefer hands-on activities another opportunity for learning.

“The education that we provide in schools is a lot different from what they do every day in the classroom. Some of the children who have a hard time with traditional lecturing react differently when we do our Creek Connections activities,” Kedzierski says. “This is another tactile experience for those students.”

Reese believes that the AR sandbox is just the tip of the iceberg when it comes to hands-on education.

“I believe virtual reality is going to augment the augmented reality,” he says. “It will be interesting to see how the AR software upgrades will add more bells and whistles to the sandbox over the next year or two.”