DLR has set up a number of projects to increase flight safety and improve the economics of aviation. Within these activities, one field of interest is the development and validation of pilot assistance systems intended to increase the situation awareness of the aircrew. The basic idea behind these systems is the principle of an ''electronic co-pilot''. All flight phases ("gate-to-gate") are taken into account, but since approach, landing, and taxiing are the most critical tasks in civil aviation, special emphasis is placed on these operations. Under adverse weather conditions in particular, the reduced visual range decreases the pilots' situation awareness during these critical flight phases. Therefore, an Enhanced and Synthetic Vision System (ESVS) is integrated into the assistance system. Data acquired by weather-penetrating sensors are combined with digital terrain data and status information by means of data fusion techniques. The resulting description of the situation is presented to the pilot via head-up or head-down displays. One promising sensor for Enhanced Vision applications is the 35 GHz MMW radar ''HiVision'' of EADS. This paper focuses on the automatic analysis of HiVision radar images with regard to the requirements of approach, landing, and taxiing. This includes the integrity monitoring of navigation data by conformity checks of database information against radar data, the detection of obstacles on the runway (and/or on taxiways), and the acquisition of navigation information by extracting runway structures from radar images.
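The conformity check mentioned above can be illustrated with a minimal sketch: the runway outline predicted from the navigation database is projected into radar image coordinates and compared with the actual radar returns; good agreement supports the integrity of the navigation solution. This is an assumption-laden toy version, not the paper's actual algorithm — the function names, the fixed intensity threshold, and the binary runway mask are all illustrative.

```python
# Hypothetical sketch of a database-vs-radar conformity check.
# Assumptions (not from the paper): the radar image is a 2D list of
# normalized intensities in [0, 1]; a smooth runway surface backscatters
# weakly, so it appears dark against the surrounding terrain; the
# database-predicted runway footprint is given as a binary mask already
# registered to the radar image.

def conformity_score(radar_image, runway_mask, runway_is_dark=True):
    """Fraction of predicted runway pixels whose radar return matches
    the expected runway signature."""
    matches = total = 0
    for radar_row, mask_row in zip(radar_image, runway_mask):
        for intensity, inside in zip(radar_row, mask_row):
            if inside:
                total += 1
                low_return = intensity < 0.3  # illustrative threshold
                if low_return == runway_is_dark:
                    matches += 1
    return matches / total if total else 0.0

def navigation_consistent(radar_image, runway_mask, threshold=0.8):
    """Flag the navigation data as consistent if the predicted runway
    location agrees sufficiently with the radar image."""
    return conformity_score(radar_image, runway_mask) >= threshold
```

In a real system the comparison would operate on extracted runway edge features rather than raw pixels, and the threshold would be tied to the radar's measured false-alarm statistics; the sketch only shows the overall shape of such an integrity monitor.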