This project applies automatic classification and recognition to microscopic images from urine analysis. Automation is valuable in microscopic urinalysis because detecting particles in these images is repetitive and time-consuming; moreover, particles in many medical laboratory samples have irregular shapes and blurred edges. The project combines medical image processing algorithms with pattern recognition and proceeds in three stages. First, the original urinary sediment microscopic images are converted into binary images by preprocessing: median filtering, conversion from color to grayscale, and image segmentation. Second, candidate objects are selected and extracted from the images. Third, the extracted objects are classified with an SVM to recognize four kinds of urine sediment components: red blood cells, white blood cells, casts, and calcium oxalate crystals.
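The preprocessing stage described above can be sketched as follows. This is a minimal illustration using only NumPy, not the project's actual code; the BT.601 grayscale weights, the 3x3 median window, and the mean-intensity threshold are assumptions standing in for the project's specific choices (the SVM classification stage would follow, e.g. via a library such as scikit-learn, and is omitted here).

```python
import numpy as np

def to_gray(rgb):
    # Color-to-grayscale conversion (ITU-R BT.601 luminance weights;
    # the actual weights used by the project are not specified).
    return rgb @ np.array([0.299, 0.587, 0.114])

def median_filter3(img):
    # 3x3 median filter built from shifted views of an edge-padded copy;
    # suppresses salt-and-pepper noise before segmentation.
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    stack = np.stack([p[i:i + h, j:j + w]
                      for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)

def binarize(img, thresh=None):
    # Global threshold to obtain the binary image; the mean intensity
    # is a placeholder default (the project's threshold rule is not stated).
    if thresh is None:
        thresh = img.mean()
    return (img > thresh).astype(np.uint8)

# Pipeline: gray = to_gray(rgb); smooth = median_filter3(gray);
# binary = binarize(smooth); then extract objects and classify with an SVM.
```

Object extraction would then label connected components in the binary image and compute per-object features (area, perimeter, shape descriptors) as the SVM's input vectors.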
HCI-LAB's Interaction Group won 3rd place in the Kinect Track and was recognized by Microsoft ATL for the 2012-2013 graduation project RemoAct: Portable Projected Interface with Hand Gesture Interaction.
RemoAct is a wearable depth-sensing and projection system that makes interaction with the surrounding environment more intuitive: users can share and send data to people nearby by performing gestures. Beyond interacting with other people, users can also send gesture commands to nearby machines such as printers, display screens, and projectors. Unlike some other wearable systems, RemoAct requires no gloves and no per-environment setup; instead, it offers a mobile, robust solution for interacting with an interface projected onto ordinary flat surfaces, or even flat organic surfaces such as the user's hand, while communicating with different people and machines across environments without any further system configuration.