Grupo de Procesado de Señal y Robótica
Escuela Politécnica Superior
Universitat de Lleida - UdL

2012 Agenda

04/12/2012 - New paper published in the scientific journal "Precision Agriculture"


J. Arno, A. Escola, J.M. Valles, J. Llorens, R. Sanz, J. Masip, J. Palacin, J.R. Rosell-Polo, Leaf area index estimation in vineyards using a ground-based LiDAR scanner, Precision Agriculture, Volume 14, Issue 3 (2013), pp. 290-306.

http://dx.doi.org/10.1007/s11119-012-9295-0

Abstract:
Estimation of grapevine vigour using mobile proximal sensors can provide an indirect method for determining grape yield and quality. Of the various indexes related to the characteristics of grapevine foliage, the leaf area index (LAI) is probably the most widely used in viticulture. To assess the feasibility of using light detection and ranging (LiDAR) sensors for predicting the LAI, several field trials were performed using a tractor-mounted LiDAR system. This system measured the crop in a transverse direction along the rows of vines and geometric and structural parameters were computed. The parameters evaluated were the height of the vines (H), the cross-sectional area (A), the canopy volume (V) and the tree area index (TAI). This last parameter was formulated as the ratio of the crop estimated area per unit ground area, using a local Poisson distribution to approximate the laser beam transmission probability within vines. In order to compare the calculated indexes with the actual values of LAI, the scanned vines were defoliated to obtain LAI values for different row sections. Linear regression analysis showed a good correlation (R2 = 0.81) between canopy volume and the measured values of LAI for 1 m long sections. Nevertheless, the best estimation of the LAI was given by the TAI (R2 = 0.92) for the same length, confirming LiDAR sensors as an interesting option for foliage characterization of grapevines. However, current limitations exist related to the complexity of data processing and to the need to accumulate a sufficient number of scans to adequately estimate the LAI.
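
The tree area index mentioned in the abstract relies on a Poisson model of laser-beam transmission through the canopy. As an illustration of that gap-fraction idea (not the paper's exact formulation), the sketch below estimates a leaf area index from the fraction of beams that pass through a canopy section; the extinction coefficient k and the way beams are counted are illustrative assumptions.

    import numpy as np

    def lai_from_gap_fraction(beams_fired, beams_intercepted, k=0.5):
        """Rough LAI estimate from LiDAR beam interception, assuming a Poisson
        (Beer-Lambert) transmission model: P_gap = exp(-k * LAI)."""
        p_gap = 1.0 - beams_intercepted / beams_fired   # probability of traversing the canopy
        if p_gap <= 0.0:
            raise ValueError("section fully opaque; the model saturates")
        return -np.log(p_gap) / k

    # Example: 1000 beams fired at a 1 m row section, 600 intercepted by foliage
    print(lai_from_gap_fraction(1000, 600))   # about 1.83 with the assumed k = 0.5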

28/11/2012 - New paper published in the scientific journal "Sensors"


M. Teixido, T. Palleja, D. Font, M. Tresanchez, J. Moreno, J. Palacin, Two-Dimensional Radial Laser Scanning for Circular Marker Detection and External Mobile Robot Tracking, Sensors, 12 (2012), 16482-16497.

The Scientific Journal "Sensors" is Open Access and this paper is freely available to everybody at: http://dx.doi.org/10.3390/s121216482

Abstract:
This paper presents the use of an external fixed two-dimensional laser scanner to detect cylindrical targets attached to moving devices, such as a mobile robot. This proposal is based on the detection of circular markers in the raw data provided by the laser scanner by applying an algorithm for outlier avoidance and a least-squares circular fitting. Some experiments have been developed to empirically validate the proposal with different cylindrical targets in order to estimate the location and tracking errors achieved, which are generally less than 20 mm in the area covered by the laser sensor. As a result of the validation experiments, several error maps have been obtained in order to give an estimate of the uncertainty of any location computed. This proposal has been validated with a medium-sized mobile robot with an attached cylindrical target (diameter 200 mm). The trajectory of the mobile robot was estimated with an average location error of less than 15 mm, and the real location error in each individual circular fitting was similar to the error estimated with the obtained error maps. The radial area covered in this validation experiment was up to 10 m, a value that depends on the radius of the cylindrical target and the radial density of the distance range points provided by the laser scanner but this area can be increased by combining the information of additional external laser scanners.
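
The marker detection described above combines outlier avoidance with a least-squares circular fit. The sketch below shows only the fitting step, using the standard algebraic (Kasa) formulation; the authors' outlier handling and implementation details may differ.

    import numpy as np

    def fit_circle(points):
        """Algebraic least-squares (Kasa) circle fit to 2D laser points (N, 2), in metres."""
        x, y = points[:, 0], points[:, 1]
        # Solve x^2 + y^2 = A*x + B*y + C in the least-squares sense
        M = np.column_stack([x, y, np.ones_like(x)])
        b = x**2 + y**2
        A, B, C = np.linalg.lstsq(M, b, rcond=None)[0]
        cx, cy = A / 2.0, B / 2.0
        return cx, cy, np.sqrt(C + cx**2 + cy**2)

    # Example: a partial arc of a 100 mm radius cylinder centred at (2.0, 1.0) m,
    # with roughly 2 mm of simulated range noise
    theta = np.linspace(0.5, 2.5, 40)
    arc = np.column_stack([2.0 + 0.1 * np.cos(theta), 1.0 + 0.1 * np.sin(theta)])
    arc += np.random.normal(scale=0.002, size=arc.shape)
    print(fit_circle(arc))   # close to (2.0, 1.0, 0.1)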

17/11/2012 - Prize in the 1st Promotional Projects Contest of the Escuela Politecnica Superior


Dani Martinez and Eduard Clotet, students of the Robotics Research Group, have won the 1st Promotional Projects Contest of the Escuela Politecnica Superior of the Universitat de Lleida.

The awarded project is titled "Utilitzacio d'un Smartphone Android en el guiatge i control remot d'un petit cotxe electric" (Use of an Android Smartphone for the guidance and remote control of a small electric car). The prize will make it possible to fully develop and implement the proposed project.

The objectives of the project are:
  • To exploit the capabilities of a Smartphone to implement a small mobile robot based on an electric car; this will require developing an Android application and the control electronics needed to turn the electric car into a robot.
  • To exploit the WiFi connectivity of Smartphones to develop a second Android application that remotely controls the mobile robot (see the sketch below).
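
As an illustration of the second objective, the sketch below shows the kind of WiFi command link that could connect the controller application to the robot. It is written in Python rather than Android Java for brevity, and the one-word UDP text protocol, IP address and port are made up for the example.

    import socket

    ROBOT_ADDR = ("192.168.1.50", 5005)   # hypothetical robot address and port

    def send_command(command):
        """Send a single drive command, e.g. 'forward', 'left', 'right' or 'stop'."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(command.encode("ascii"), ROBOT_ADDR)

    for command in ["forward", "forward", "left", "stop"]:
        send_command(command)
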
16/11/2012 - Setmana de la Ciencia de Catalunya (Catalan Science Week)


Setmana de la Ciencia de Catalunya: Robotics Workshop

Description of the process of building a mobile robot from an Android Smartphone

  • Price: Free; prior registration required
  • Date: 16/11/2012
  • Workshop schedule: 10:00 to 11:30 and a second session from 11:30 to 13:00
  • Venue: Sala de Graus, Escola Politecnica Superior de la Universitat de Lleida, C/ Jaume II, 69 (Campus de Cappont), 25001 Lleida
  • Registration: No places left
  • Workshop website: official website, Setmana de la Ciencia de Catalunya

Contents
The workshop will present some of the research activities carried out by the Robotics Group of the UdL, together with one of the projects currently under way: the creation of a small mobile robot from an Android Smartphone.

22/10/2012 - New paper published in the scientific journal "Sensors"


M. Teixido, D. Font, T. Palleja, M. Tresanchez, M. Nogues, J. Palacin, An Embedded Real-Time Red Peach Detection System Based on an OV7670 Camera, ARM Cortex-M4 Processor and 3D Look-Up Tables, Sensors, 12 (2012), 14129-14143.

    The Scientific Journal "Sensors" is Open Access and this paper is freely available to everybody at: http://dx.doi.org/10.3390/s121014129

    Abstract:
    This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up-table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs based on existing linear color models and fruit histograms were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
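
A three-dimensional LUT turns per-pixel colour classification into a single memory read, which is what makes real-time operation feasible on a small microcontroller. The sketch below illustrates the mechanism with an assumed 32x32x32 table (5 bits per colour channel) and a toy "red dominates" rule standing in for the optimised tables of the paper.

    import numpy as np

    BITS = 5
    SIZE = 1 << BITS   # 32 bins per channel; a uint8 table then takes 32 KB

    def build_lut(is_fruit_colour):
        """Fill the LUT offline from a predicate is_fruit_colour(r, g, b) -> bool."""
        lut = np.zeros((SIZE, SIZE, SIZE), dtype=np.uint8)
        for r in range(SIZE):
            for g in range(SIZE):
                for b in range(SIZE):
                    # Evaluate the predicate at a representative 8-bit value per bin
                    lut[r, g, b] = is_fruit_colour(r << (8 - BITS),
                                                   g << (8 - BITS),
                                                   b << (8 - BITS))
        return lut

    def classify(image, lut):
        """Label every pixel of an (H, W, 3) uint8 image with one table read each."""
        idx = image >> (8 - BITS)
        return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

    # Toy rule standing in for the optimised colour models of the paper
    lut = build_lut(lambda r, g, b: r > 120 and r > g + 40 and r > b + 40)
    image = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
    print(classify(image, lut))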

    19/09/2012 - One scientific paper presented at RHEA 2012

M. Tresanchez, M. Teixido, D. Font, T. Palleja, J. Palacin, Embedded Vision System for Real-Time Fruit Detection and Tracking, First International Conference on Robotics and Associated High-Technologies and Equipment for Agriculture (RHEA 2012), 331-336, Pisa, Italy, September 19-21, 2012.


Abstract:
This work presents an embedded vision system designed for real-time fruit detection and tracking. The system is based on one STM32F407VGT6 microcontroller module and one Omnivision OV7670 color camera module. The complete system will be included in the hand of a robotized arm designed for automatic fruit harvesting. The main objective of the vision system will be the control of the robotic arm in order to pick up the selected fruits.

    27/07/2012 - HeadMouse 4.2

HeadMouse 4.2 (download). HeadMouse is a virtual mouse controlled with slight head movements that allows "clicks" to be made using different facial gestures.

Watch the demo video

Main improvements in version 4.2 (July 2012, download):

  • New user manuals: Spanish, English, Portuguese (Brazil)1.
  • New buttons allow adjusting the sensitivity level of eye and mouth clicks.
  • A click test area has been added to check whether the selected sensitivity level is adequate.
  • New functions have been added to the macro creation menu; any website, file or program can now be opened even more easily.
  • Cursor movement control near the edges of the screen has been improved.
  • Two indicators now show the number of clicks made and the accumulated cursor displacement while the program is in use.
  • Resource and RAM consumption has been reduced.
  • The list of webcams compatible with HeadMouse has been updated.

Evolution of HeadMouse and access to previous versions.

1Brazilian Portuguese manual courtesy of Douglas Jerico.

Demo videos available on YouTube

27/07/2012 - VirtualKeyboard 3.2

VirtualKeyboard 3.2 (download). VirtualKeyboard is a virtual keyboard displayed on the computer screen. It was created within the TeCLado Asistivo (TCLA keyboard) project and its goal is to allow texts to be written by pressing the on-screen keys with the computer mouse, HeadMouse or any other pointing device connected to the computer.

Watch the demo video

Main improvements in version 3.2 (July 2012, download):

  • New user manuals: Spanish, English, Portuguese (Brazil)1.
  • The macro creation system has been revised and improved.
  • A new option allows the central buttons to be used flexibly, either for word prediction or for user-defined macros.
  • The autoclick and sweep-click (SweepClick) systems have been improved.
  • The implementation of the [SHIFT], [ALT] and [CTRL] keys has been modified to make them compatible with videoconferencing programs.

Evolution of VirtualKeyboard and previous versions.

1Brazilian Portuguese manual courtesy of Douglas Jerico.

Demo videos available on YouTube

27/07/2012 - Two scientific papers presented at the SAAEI 2012 conference - Guimaraes, Portugal

    M. Teixido, D. Font, T. Palleja, M. Tresanchez, J. Palacin, Ejemplo de caso practico de aprendizaje combinando vision artificial y un brazo robot, Seminario Anual de Automatica, Electronica Industrial e Instrumentacion (SAAEI 2012), 835-839, Guimaraes, Portugal, 11-13 Julio 2012.


    D. Font, M. Teixido, T. Palleja, M. Tresanchez, J. Palacin, Estudio preliminar de clasificacion de variedades de nectarinas en base al histograma de su color de piel, Seminario Anual de Automatica, Electronica Industrial e Instrumentacion (SAAEI 2012), 735-739, Guimaraes, Portugal, 11-13 Julio 2012.

25/07/2012 - Summer


    People: Dani Martinez, Davinia Font, Eduard Clotet, Marcel Tresanchez, Merce Teixido, Raul Balsa, Tomas Palleja (hidden), Javier Moreno, David Runcan (missing), Joan Salvatella (missing).
    Robots: rBot, Zep Zeti.

    11/06/2012 - Some images of projects under development

7/06/2012 - New paper published in the scientific journal "Sensors"


M. Teixido, D. Font, T. Palleja, M. Tresanchez, M. Nogues, J. Palacin, Definition of Linear Color Models in the RGB Vector Color Space to Detect Red Peaches in Orchard Images Taken under Natural Illumination, Sensors, 12 (2012), 7701-7718.

    The Scientific Journal "Sensors" is Open Access and this paper is freely available to everybody at: http://dx.doi.org/10.3390/s120607701

    Abstract: This work proposes the detection of red peaches in orchard images based on the definition of different linear color models in the RGB vector color space. The classification and segmentation of the pixels of the image is then performed by comparing the color distance from each pixel to the different previously defined linear color models. The methodology proposed has been tested with images obtained in a real orchard under natural light. The peach variety in the orchard was the paraguayo (Prunus persica var. platycarpa) peach with red skin. The segmentation results showed that the area of the red peaches in the images was detected with an average error of 11.6%; 19.7% in the case of bright illumination; 8.2% in the case of low illumination; 8.6% for occlusion up to 33%; 12.2% in the case of occlusion between 34 and 66%; and 23% for occlusion above 66%. Finally, a methodology was proposed to estimate the diameter of the fruits based on an ellipsoidal fitting. A first diameter was obtained by using all the contour pixels and a second diameter was obtained by rejecting some pixels of the contour. This approach enables a rough estimate of the fruit occlusion percentage range by comparing the two diameter estimates.
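
In this approach each linear colour model is a line in the RGB vector space, and a pixel is labelled according to how far its colour lies from the model. A minimal sketch of that point-to-line distance test is given below; the two endpoint colours and the acceptance threshold are illustrative values, not the models fitted in the paper.

    import numpy as np

    def distance_to_line(pixels, p0, p1):
        """Euclidean distance from RGB pixels (N, 3) to the 3D line through p0 and p1."""
        d = (p1 - p0) / np.linalg.norm(p1 - p0)   # unit direction of the linear colour model
        v = pixels - p0
        residual = v - np.outer(v @ d, d)         # component orthogonal to the line
        return np.linalg.norm(residual, axis=1)

    # Hypothetical model spanning dark to bright red-peach skin tones
    p0 = np.array([90.0, 20.0, 25.0])
    p1 = np.array([220.0, 80.0, 70.0])

    pixels = np.array([[180.0, 60.0, 55.0],    # reddish pixel -> small distance
                       [ 60.0, 140.0, 70.0]])  # leafy green   -> large distance
    dist = distance_to_line(pixels, p0, p1)
    print(dist, dist < 25.0)                   # the threshold of 25 is an assumption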

    13/05/2012 - Two scientific papers presented at 2012 IEEE I2MTC

    D. Font, T. Palleja, M. Tresanchez, M. Teixido, J. Palacin, Preliminary study on color based nectarine variety classification, 2012 IEEE International Instrumentation and Measurement Technology Conference (2012 IEEE I2MTC), Graz, Austria, May 13-16, 2012.


    M. Tresanchez, D. Font, M. Teixido, T. Palleja, J. Palacin, Preliminary Study of Pupil Detection and Tracking with Low Cost Optical Flow Sensors, Proceedings of the 2012 IEEE International Instrumentation and Measurement Technology Conference (2012 IEEE I2MTC), Graz, Austria, May 13-16, 2012.


Soon available in IEEE Xplore.

4/05/2012 - rBot demo video

The new rBot (red roBot) robot is available in the laboratory for experimentation.

The demo video shows rBot chasing a red ball. The ball is detected with a Minoru 3D camera, whose stereo image pair is used to estimate the orientation of the ball and its distance from the robot.
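
As a rough illustration of how a stereo pair yields the range and bearing of the ball, the sketch below applies the standard disparity relation; the focal length, baseline and principal point are guessed, uncalibrated values for a Minoru-class webcam pair, not parameters taken from rBot.

    import numpy as np

    FOCAL_PX = 700.0    # assumed focal length in pixels
    BASELINE_M = 0.06   # assumed stereo baseline in metres
    CX = 320.0          # principal point column for a 640-pixel-wide image

    def ball_range_and_bearing(x_left, x_right):
        """Range and bearing of the ball from its centre column in both images."""
        disparity = x_left - x_right                 # pixels, positive for objects in front
        depth = FOCAL_PX * BASELINE_M / disparity    # metres
        bearing = np.arctan2(x_left - CX, FOCAL_PX)  # radians from the left camera axis
        return depth, np.degrees(bearing)

    print(ball_range_and_bearing(x_left=350.0, x_right=322.0))   # about 1.5 m, 2.5 degrees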

A demo video is available on YouTube: http://youtu.be/EO7BzX0ntAo

    2/05/2012 - One scientific paper accepted at RHEA 2012

    One research paper will be presented at the First International Conference on Robotics and Associated High-Technologies and Equipment for Agriculture (RHEA 2012). September 19-21, Pisa, Italy.

    23/04/2012 - Two scientific papers accepted at SAAEI 2012

Two research papers will be presented at the scientific conference Seminario Anual de Automatica, Electronica Industrial e Instrumentacion (SAAEI 2012). July 11-13, Guimaraes, Portugal.

    18/03/2012 - Two scientific papers accepted at 2012 IEEE I2MTC

Two scientific papers will be presented at the 2012 IEEE International Instrumentation and Measurement Technology Conference (2012 IEEE I2MTC). May 13-16, Graz, Austria.

21/02/2012 - Presentation of the NAO humanoid robot

NAO by Aldebaran Robotics | NAO in Lleida | YouTube videos

    NAO datasheet.pdf
Presentation of the NAO humanoid robot by the company Aldebaran Robotics

Date: Tuesday, 21 February
Venue: Room 2.03, Escuela Politecnica Superior (Universitat de Lleida)
Address: C/ Jaume II, 69. 25001 Lleida
Time: 16:00
Language of the presentation: English and Spanish

Outline of the presentation:
• Presentation of Aldebaran Robotics, its research axes and partnerships (10-15 min)
• Demonstration of the robotic platform NAO (15-20 min)
• Workshop on programming NAO and its sub-systems (30-40 min):
  • Choregraphe: a graphical programming environment
  • Developing in Python and C++
  • Open API
• The monitoring and simulation environment (15-20 min)
• Questions & answers (15-20 min)
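
As a minimal example of the kind of NAOqi Python call such a programming session typically covers, the snippet below makes the robot stand up and speak; it assumes the naoqi Python SDK is installed and a robot is reachable, and the IP address is a placeholder.

    from naoqi import ALProxy

    ROBOT_IP = "192.168.1.100"   # placeholder; replace with the robot's actual address
    PORT = 9559                  # default NAOqi port

    # Two service proxies: text-to-speech and predefined whole-body postures
    tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
    posture = ALProxy("ALRobotPosture", ROBOT_IP, PORT)

    posture.goToPosture("StandInit", 0.5)   # second argument is the relative speed (0-1)
    tts.say("Hello from Lleida")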

15/02/2012 - Accessibility Chair of the Universitat de Lleida

Catedra de Accesibilidad a las TIC (Chair for ICT Accessibility)


Signing of the renewal agreement for the Indra-Fundacion Adecco Chair for ICT Accessibility of the Universitat de Lleida

Thanks to this agreement, and to the sponsorship of Indra and Fundacion Adecco, HeadMouse and VirtualKeyboard will be able to keep evolving as tools for the labour-market integration of people with disabilities.

Link to the official news item on the UdL website.


    9/02/2012 - PhD Position in Signal Processing

PhD Position in Signal Processing (more information)

The Robotics group of the University of Lleida and the Artificial Olfaction group of the Institute for Bioengineering of Catalonia (IBEC) are looking for a PhD candidate to apply for the 2012 FPI call from the Ministerio de Economia y Competitividad. The candidate will be based at IBEC (Barcelona).

    Tasks and responsibilities:
  • Development of algorithms for the navigation of a robot for the autonomous localisation of toxic leakages
  • Processing of Ion Mobility Spectra time-series
  • Programming for real time operation
  • Algorithm testing

Application deadline: 17/02/2012

Please send applications directly to the e-mail address listed at IBEC/predoc.


    © Grupo de Procesado de Señal y Robótica