991 results for resolution prediction

Relevance: 30.00%

Abstract:

The quality of temperature and humidity retrievals from the infrared SEVIRI sensors on the geostationary Meteosat Second Generation (MSG) satellites is assessed by means of a one-dimensional variational (1D-VAR) algorithm. The study aims to improve the spatial and temporal resolution of the observations available to feed analysis systems designed for high-resolution, regional-scale numerical weather prediction (NWP) models. The non-hydrostatic forecast model COSMO (COnsortium for Small-scale MOdelling) in the ARPA-SIM operational configuration is used to provide background fields. Only clear-sky observations over sea are processed. An optimised 1D-VAR set-up comprising the two water-vapour and the three window channels is selected. It maximises the reduction of errors in the model backgrounds while ensuring ease of operational implementation through accurate bias-correction procedures and correct radiative-transfer simulations. The 1D-VAR retrieval quality is first quantified in relative terms, using statistics to estimate the reduction in the background model errors. The absolute retrieval accuracy is then assessed by comparing the analysis with independent radiosonde and satellite observations. The inclusion of satellite data brings a substantial reduction in the warm and dry biases present in the forecast model. Moreover, the retrieval profiles generated by the 1D-VAR are shown to be well correlated with the radiosonde measurements. The 1D-VAR technique is subsequently applied to two three-dimensional case studies: a false-alarm case that occurred in Friuli-Venezia Giulia on 8 July 2004 and a heavy-precipitation case that occurred in the Emilia-Romagna region between 9 and 12 April 2005. The impact of satellite data for these two events is evaluated in terms of increments in the column-integrated water vapour and saturation water vapour, in the 2-metre temperature and specific humidity, and in the surface temperature.
To improve the 1D-VAR technique, a method for calculating flow-dependent model error covariance matrices is also assessed. The approach employs members of an ensemble forecast system generated by perturbing physical parameterisation schemes inside the model. The improved set-up, applied to the case of 8 July 2004, shows an essentially neutral impact.
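The retrieval step described above rests on the standard variational cost function J(x) = (x - xb)ᵀB⁻¹(x - xb) + (y - H(x))ᵀR⁻¹(y - H(x)). The sketch below is a minimal illustration with made-up numbers: a linear observation operator H stands in for the real radiative-transfer model, so the analysis has the familiar closed form xa = xb + K(y - Hxb); none of the values are taken from the study.

```python
import numpy as np

# Toy 1D-Var: minimise J(x) = (x-xb)^T B^-1 (x-xb) + (y-Hx)^T R^-1 (y-Hx).
# For a linear H the minimiser has the closed form
#   xa = xb + K (y - H xb),  with Kalman gain  K = B H^T (H B H^T + R)^-1.
# A flow-dependent B, as in the ensemble extension above, would simply be
# estimated from ensemble perturbations before calling this function.

def one_dvar_analysis(xb, B, H, y, R):
    """Return the analysis state for a linear-H 1D-Var problem."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman gain
    return xb + K @ (y - H @ xb)

xb = np.array([280.0, 270.0, 250.0])   # hypothetical background profile (K)
B = np.diag([4.0, 4.0, 4.0])           # background error covariance
H = np.array([[0.5, 0.5, 0.0]])        # one channel sensing the lower levels
R = np.array([[1.0]])                  # observation error covariance
y = np.array([277.0])                  # observed brightness-temperature proxy

xa = one_dvar_analysis(xb, B, H, y, R)
```

The analysis pulls the two levels seen by the channel towards the observation and leaves the unobserved level at its background value.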


Proteins exist in many different types, with diverse functions that are essential for living organisms. An important class is represented by transmembrane proteins, which are specifically designed to be inserted into biological membranes and perform very important functions in the cell, such as cell communication and active transport across the membrane. Transmembrane β-barrels (TMBBs) are a sub-class of membrane proteins largely under-represented in structure databases because of the extreme difficulty of experimental structure determination. For this reason, computational tools able to predict the structure of TMBBs are needed. In this thesis, two computational problems related to TMBBs were addressed: the detection of TMBBs in large datasets of proteins and the prediction of the topology of TMBB proteins. First, a method for TMBB detection was presented, based on a novel neural network framework for variable-length sequence classification. The proposed approach was validated on a non-redundant dataset of proteins. Furthermore, we carried out genome-wide detection using the entire Escherichia coli proteome. In both experiments, the method significantly outperformed other existing state-of-the-art approaches, reaching very high PPV (92%) and MCC (0.82). Second, a method was introduced for TMBB topology prediction. The proposed approach is based on grammatical modelling and probabilistic discriminative models for sequence data labeling. The method was evaluated using a newly generated dataset of 38 TMBB proteins obtained from high-resolution data in the PDB. Results show that the model correctly predicts the topologies of 25 out of 38 protein chains in the dataset. When tested on previously released datasets, the performance of the proposed approach was comparable or superior to the current state of the art of TMBB topology prediction.
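The detection figures quoted above (PPV and MCC) follow the standard confusion-matrix definitions, sketched below. The counts are made up for illustration; they are not the counts from the E. coli experiment.

```python
# Positive predictive value and Matthews correlation coefficient from a
# binary confusion matrix (tp/tn/fp/fn = true/false positives/negatives).
def ppv(tp, fp):
    return tp / (tp + fp)

def mcc(tp, tn, fp, fn):
    num = tp * tn - fp * fn
    den = ((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5
    return num / den

# hypothetical counts from a proteome-wide screen for TMBBs
tp, tn, fp, fn = 46, 930, 4, 20
ppv_val = ppv(tp, fp)            # 46 / 50 = 0.92
mcc_val = mcc(tp, tn, fp, fn)
```

MCC is the more demanding of the two measures on imbalanced data such as a proteome screen, since it accounts for all four cells of the confusion matrix rather than only the predicted positives.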


Accurate anharmonic vibrational frequencies for water clusters consisting of 2-5 water molecules have been predicted by comparing different methods against MP2/aug-cc-pVTZ calculated and experimental anharmonic frequencies. The combination of HF/6-31G* scaled frequencies for intramolecular modes with anharmonic frequencies for intermolecular modes gives excellent agreement with experiment for the water dimer and trimer and is as good as the expensive anharmonic MP2 calculations. The water trimer, the cyclic Ci and S4 tetramers, and the cyclic pentamer all have unique peaks in the infrared spectrum between 500 and 800 cm⁻¹ and between 3400 and 3700 cm⁻¹. Under the right experimental conditions these different clusters can be uniquely identified using high-resolution IR spectroscopy.
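The scaled-frequency part of the scheme above amounts to multiplying each harmonic frequency by an empirical factor. As a minimal sketch, the snippet below applies 0.8953, a commonly quoted scaling factor for HF/6-31G* harmonic frequencies, to two hypothetical O-H stretch values; the input frequencies are illustrative, not numbers from the paper.

```python
# Empirical frequency scaling: anharmonic estimate = scale * harmonic value.
# 0.8953 is a commonly quoted HF/6-31G* factor; confirm against the paper's
# chosen factor before reuse.
SCALE_HF_631GS = 0.8953

def scale_frequencies(harmonic_cm1, factor=SCALE_HF_631GS):
    """Scale a list of harmonic frequencies (cm^-1) by an empirical factor."""
    return [f * factor for f in harmonic_cm1]

harmonic = [4070.0, 4180.0]        # hypothetical HF O-H stretches (cm^-1)
scaled = scale_frequencies(harmonic)
```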


Areal bone mineral density (aBMD) at the distal tibia, measured at the epiphysis (T-EPI) and diaphysis (T-DIA), is predictive of fracture risk. Structural bone parameters evaluated at the distal tibia by high-resolution peripheral quantitative computed tomography (HR-pQCT) have shown differences between healthy and fracture patients. With its simple geometry, T-DIA may allow the correlation between bone structural parameters and bone strength to be investigated.


Genome predictions based on selected genes would be a very welcome approach for taxonomic studies, including DNA-DNA similarity, G+C content and representative phylogeny of bacteria. At present, DNA-DNA hybridizations are still considered the gold standard in species descriptions. However, this method is time-consuming and troublesome, and datasets can vary significantly between experiments as well as between laboratories. For the same reasons, full-matrix hybridizations are rarely performed, weakening the significance of the results obtained. The authors established a universal sequencing approach for the three genes recN, rpoA and thdF in the Pasteurellaceae, and determined whether the sequences could be used to predict DNA-DNA relatedness within the family. Sequence-based similarity values calculated using a previously published formula proved most useful for species and genus separation, indicating that this method provides better resolution, with none of the experimental variation, compared to hybridization. The method makes cross-comparisons across species and genus borders within the family straightforward. The three genes also serve as an indicator of the genomic G+C content of a species; a mean divergence of around 1% from the classical method was observed, which itself has poor reproducibility. Finally, the three genes can be used alone or in combination with already-established 16S rRNA, rpoB and infB gene-sequencing strategies in a multisequence-based phylogeny for the family Pasteurellaceae. It is proposed to use the three sequences as a taxonomic tool, replacing DNA-DNA hybridization.
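Predicting genomic G+C content from sequenced genes reduces, at its core, to computing the base composition of the fragments. A minimal sketch follows; the sequence is made up and is not a real recN, rpoA or thdF gene.

```python
# G+C content of a DNA sequence, in percent.
def gc_content(seq):
    seq = seq.upper()
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

fragment = "ATGGCGCGTAAAGGCCTTACG"   # hypothetical gene fragment
gc = gc_content(fragment)            # 12 of 21 bases are G or C
```

Averaging this value over the three sequenced genes gives the single-number genome G+C estimate that the abstract compares against the classical (thermal denaturation) method.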


High-resolution, well-calibrated records of lake sediments are critically important for quantitative climate reconstructions, but they remain a methodological and analytical challenge. While several comprehensive paleotemperature reconstructions have been developed across Europe, only a few quantitative high-resolution studies exist for precipitation. Here we present a calibration and verification study of lithoclastic sediment proxies from proglacial Lake Oeschinen (46°30′N, 7°44′E, 1,580 m a.s.l., north–west Swiss Alps) that are sensitive to rainfall for the period AD 1901–2008. We collected two sediment cores, one in 2007 and another in 2011. The sediments are characterized by two facies: (A) mm-laminated clastic varves and (B) turbidites. The annual character of the laminae couplets was confirmed by radiometric dating (210Pb, 137Cs) and independent flood-layer chronomarkers. Individual varves consist of a dark sand-size spring-summer layer enriched in siliciclastic minerals and a lighter clay-size calcite-rich winter layer. Three subtypes of varves are distinguished: Type I with a 1–1.5 mm fining upward sequence; Type II with a distinct fine-sand base up to 3 mm thick; and Type III containing multiple internal microlaminae caused by individual summer rainstorm deposits. Delta-fan surface samples and sediment trap data fingerprint different sediment source areas and transport processes from the watershed and confirm the instant response of sediment flux to rainfall and erosion. Based on a highly accurate, precise and reproducible chronology, we demonstrate that sediment accumulation (varve thickness) is a quantitative predictor for cumulative boreal alpine spring (May–June) and spring/summer (May–August) rainfall (rMJ = 0.71, rMJJA = 0.60, p < 0.01). 
Bootstrap-based verification of the calibration model reveals a root mean squared error of prediction (RMSEPMJ = 32.7 mm, RMSEPMJJA = 57.8 mm) which is on the order of 10–13 % of mean MJ and MJJA cumulative precipitation, respectively. These results highlight the potential of the Lake Oeschinen sediments for high-resolution reconstructions of past rainfall conditions in the northern Swiss Alps, central and eastern France and south-west Germany.
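The calibration-and-verification idea above can be sketched as a simple regression of cumulative rainfall on varve thickness with a cross-validated prediction error. The data below are synthetic stand-ins for the Lake Oeschinen record, and leave-one-out cross-validation is used here for brevity where the study used a bootstrap scheme.

```python
import numpy as np

# Synthetic proxy record: rainfall linearly linked to varve thickness + noise.
rng = np.random.default_rng(0)
thickness = rng.uniform(1.0, 4.0, 50)                    # varve thickness (mm)
rainfall = 60.0 * thickness + rng.normal(0.0, 40.0, 50)  # cumulative MJ rain (mm)

def rmsep_loo(x, y):
    """Leave-one-out RMSE of prediction for a linear calibration model."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        errs.append(y[i] - (slope * x[i] + intercept))
    return float(np.sqrt(np.mean(np.square(errs))))

rmsep = rmsep_loo(thickness, rainfall)
```

Expressing the RMSEP as a fraction of mean cumulative rainfall, as the abstract does, makes calibrations with different target seasons comparable.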


Finite element analysis for prediction of bone strength. Philippe K Zysset, Enrico Dall'Ara, Peter Varga & Dieter H Pahr. BoneKEy Reports (2013) 2, Article number 386; doi:10.1038/bonekey.2013.120. Received 03 January 2013; accepted 25 June 2013; published online 07 August 2013.
Finite element (FE) analysis has been applied for the past 40 years to simulate the mechanical behavior of bone. Although several validation studies have been performed on specific anatomical sites and load cases, this study aims to review the predictability of human bone strength at the three major osteoporotic fracture sites, as quantified in recently completed in vitro studies at our former institute. Specifically, the performance of FE analysis based on clinical quantitative computed tomography (QCT) is compared with that of the current densitometric standards: bone mineral content, bone mineral density (BMD) and areal BMD (aBMD). Clinically relevant fractures were produced by monotonic axial compression of distal radii and vertebral sections and by side loading of proximal femora. QCT-based FE models of the three bones were developed to simulate as closely as possible the boundary conditions of each experiment. For all sites, the FE methodology exhibited the lowest errors and the highest correlations in predicting the experimental bone strength. Likely due to the improved CT image resolution, the quality of the FE prediction in the peripheral skeleton using high-resolution peripheral CT was superior to that in the axial skeleton with whole-body QCT. Because of its projective and scalar nature, the performance of aBMD in predicting bone strength depended on loading mode: it was significantly inferior to FE in axial compression of radial or vertebral sections, but not significantly inferior to FE in side loading of the femur. Considering the cumulated evidence from the published validation studies, it is concluded that FE models provide the most reliable surrogates of bone strength at any of the three fracture sites.
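The review's comparison criterion, lowest error and highest correlation against experimentally measured strength, can be illustrated in miniature. The data below are synthetic and constructed so that the "FE" surrogate tracks strength more closely than the "aBMD" surrogate; they are not measurements from the reviewed studies.

```python
import numpy as np

# Synthetic experiment: two surrogates for experimental bone strength.
rng = np.random.default_rng(3)
strength = rng.uniform(2.0, 10.0, 60)              # experimental strength (kN)
fe = strength + rng.normal(0.0, 0.5, 60)           # tight FE-style surrogate
abmd = 0.08 * strength + rng.normal(0.0, 0.12, 60) # looser aBMD-style surrogate

def pearson_r(a, b):
    """Pearson correlation coefficient between two 1-D arrays."""
    return float(np.corrcoef(a, b)[0, 1])

r_fe = pearson_r(fe, strength)
r_abmd = pearson_r(abmd, strength)
```

Ranking surrogates by such correlations (together with prediction errors) is the comparison the abstract summarises across the three fracture sites.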


High-resolution, ground-based and independent observations, including co-located wind radiometers, lidar stations and infrasound instruments, are used to evaluate the accuracy of general circulation models and data-constrained assimilation systems in the middle atmosphere at northern-hemisphere midlatitudes. Systematic comparisons between observations, the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analyses including the recent Integrated Forecast System cycles 38r1 and 38r2, NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalyses, and the free-running Max Planck Institute Earth System Model, low resolution (MPI-ESM-LR) climate model are carried out in both the temporal and spectral domains. We find that ECMWF and MERRA are broadly consistent with lidar and wind radiometer measurements up to ~40 km. For both temperature and horizontal wind components, deviations increase with altitude as the assimilated observations become sparser. Between 40 and 60 km altitude, the standard deviation of the mean difference exceeds 5 K for temperature and 20 m/s for zonal wind. The largest deviations are observed in winter, when the variability from large-scale planetary waves dominates. Between lidar data and MPI-ESM-LR, there is overall agreement in spectral amplitude down to 15–20 days. At shorter time scales, variability is lacking in the model by ~10 dB. Infrasound observations indicate generally good agreement with ECMWF wind and temperature products. As such, this study demonstrates the potential of the infrastructure of the Atmospheric Dynamics Research Infrastructure in Europe project, which integrates various measurements and provides a quantitative understanding of stratosphere-troposphere dynamical coupling for numerical weather prediction applications.
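The comparison statistics quoted above (mean difference and its standard deviation per altitude level) can be computed directly from co-located profiles. The arrays below are synthetic stand-ins for model and lidar temperature profiles, not the campaign data.

```python
import numpy as np

# Synthetic co-located profiles: 20 altitude levels x 100 nights.
rng = np.random.default_rng(1)
lidar = 240.0 + rng.normal(0.0, 3.0, size=(20, 100))       # lidar T (K)
model = lidar + 1.0 + rng.normal(0.0, 5.0, size=(20, 100)) # biased, noisier model

diff = model - lidar
bias = diff.mean(axis=1)     # mean model-minus-lidar difference per level
spread = diff.std(axis=1)    # std of the difference per level
```

Plotting `bias` and `spread` against altitude is the standard way to show where an analysis system drifts from the independent observations as the assimilated data thin out with height.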


Patient-specific biomechanical models including local bone mineral density and anisotropy have gained importance for assessing musculoskeletal disorders. However, the trabecular bone anisotropy captured by high-resolution imaging is only available at the peripheral skeleton in clinical practice. In this work, we propose a supervised learning approach to predict trabecular bone anisotropy that builds on a novel set of pose-invariant feature descriptors. The statistical relationship between trabecular bone anisotropy and the feature descriptors was learned from a database of pairs of high-resolution QCT and clinical QCT reconstructions. On a set of leave-one-out experiments, we compared the accuracy of the proposed approach to previous ones, and report a mean prediction error of 6% for the tensor norm, 6% for the degree of anisotropy and 19° for the principal tensor direction. These findings show the potential of the proposed approach to predict trabecular bone anisotropy from clinically available QCT images.
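One of the error measures reported above is the angle between predicted and ground-truth principal directions of the anisotropy (fabric) tensor. The sketch below shows that computation on illustrative symmetric 3×3 tensors; the predicted tensor is constructed by rotating the true one by a known 19° so the recovered angle is checkable.

```python
import numpy as np

def principal_direction(tensor):
    """Eigenvector of the largest eigenvalue of a symmetric tensor."""
    w, v = np.linalg.eigh(tensor)
    return v[:, np.argmax(w)]

def angle_deg(t_pred, t_true):
    """Angle (degrees) between the principal directions of two tensors."""
    a, b = principal_direction(t_pred), principal_direction(t_true)
    c = abs(float(a @ b))          # sign of an eigenvector is arbitrary
    return float(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))

theta = np.radians(19.0)
c, s = np.cos(theta), np.sin(theta)
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
T_true = np.diag([1.0, 0.8, 0.6])   # illustrative fabric tensor
T_pred = Rz @ T_true @ Rz.T         # same tensor, rotated 19 degrees about z
err = angle_deg(T_pred, T_true)
```

Taking the absolute value of the dot product matters: eigenvectors are only defined up to sign, so without it a perfect prediction could report a 180° error.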


Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are formed to enable a biological function and are disassembled once the process is complete. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the better design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the individual components of an assembly but encounter difficulties with the entire system. Imaging techniques, on the other hand, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.
In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed for a direct atomic interpretation: one cannot reliably locate atoms or amino-acid residues within the structure obtained by cryo-EM. Additional information must therefore be considered, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D features, such as the tubular densities that generally correspond to alpha-helices. Such features can be annotated and then used to build an atomic model of the system, as in manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic for generating an atomic model of the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies for integrating multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled "An assembly model for Rift Valley fever virus", was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. The reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The resulting atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations into the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title "Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions". This manuscript introduces evolutionary tabu search strategies for multi-body registration: a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space.
As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the individual components but not for the entire system. Evolutionary tabu search strategies enable the building of an atomic model of the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling proper docking into the low-resolution map of the entire assembly. Along with the method description, the manuscript covers validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled "Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions" and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary-structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary-structure elements such as alpha helices show consistent patterns in cryo-EM data, being visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70 and 100% in experimental test cases, i.e. 70-100% of the alpha-helices were correctly predicted in an automatic manner from the experimental data.
The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at higher resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
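To make the tabu-search ingredient concrete, the toy below runs a tabu-augmented local search that places a small 1D density profile inside a 1D "map" by maximising overlap. This is a drastic simplification: the dissertation's method is a genetic/tabu hybrid over 3D rigid-body placements of multiple components, whereas here a single component moves over integer shifts. The map and component are made up.

```python
import numpy as np

def score(map_1d, comp, shift):
    """Overlap (dot product) of the component placed at a given shift."""
    return float(np.dot(map_1d[shift:shift + len(comp)], comp))

def tabu_search(map_1d, comp, start=0, iters=50, tabu_len=5):
    """Local search over shifts; recently visited shifts are tabu,
    which forces the walk past flat, zero-score regions of the map."""
    best = cur = start
    tabu = [start]
    for _ in range(iters):
        moves = [s for s in (cur - 1, cur + 1)
                 if 0 <= s <= len(map_1d) - len(comp) and s not in tabu]
        if not moves:
            break
        cur = max(moves, key=lambda s: score(map_1d, comp, s))
        tabu = (tabu + [cur])[-tabu_len:]
        if score(map_1d, comp, cur) > score(map_1d, comp, best):
            best = cur
    return best

comp = np.array([1.0, 2.0, 1.0])    # hypothetical component density
map_1d = np.zeros(30)
map_1d[12:15] = comp                # true placement at shift 12
best_shift = tabu_search(map_1d, comp, start=0)
```

The tabu list is what lets the search traverse the scoreless region between the start and the density peak: plain hill climbing would stall immediately, since every early move has zero score.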


La gestión del tráfico aéreo (Air Traffic Management, ATM) está experimentando un cambio de paradigma hacia las denominadas operaciones basadas trayectoria. Bajo dicho paradigma se modifica el papel de los controladores de tráfico aéreo desde una operativa basada su intervención táctica continuada hacia una labor de supervisión a más largo plazo. Esto se apoya en la creciente confianza en las soluciones aportadas por las herramientas automatizadas de soporte a la decisión más modernas. Para dar soporte a este concepto, se precisa una importante inversión para el desarrollo, junto con la adquisición de nuevos equipos en tierra y embarcados, que permitan la sincronización precisa de la visión de la trayectoria, basada en el intercambio de información entre ambos actores. Durante los últimos 30 a 40 años las aerolíneas han generado uno de los menores retornos de la inversión de entre todas las industrias. Sin beneficios tangibles, la industria aérea tiene dificultades para atraer el capital requerido para su modernización, lo que retrasa la implantación de dichas mejoras. Esta tesis tiene como objetivo responder a la pregunta de si las capacidades actualmente instaladas en las aeronaves comerciales se pueden aplicar para lograr la sincronización de la trayectoria con el nivel de calidad requerido. Además, se analiza en ella si, conjuntamente con mejoras en las herramientas de predicción trayectorias instaladas en tierra en para facilitar la gestión de las arribadas, dichas capacidades permiten obtener los beneficios esperados en el marco de las operaciones basadas en trayectoria. Esto podría proporcionar un incentivo para futuras actualizaciones de la aviónica que podrían llevar a mejoras adicionales. El concepto operacional propuesto en esta tesis tiene como objetivo permitir que los aviones sean pilotados de una manera consistente con las técnicas actuales de vuelo optimizado. 
Se permite a las aeronaves que desciendan en el denominado “modo de ángulo de descenso gestionado” (path-managed mode), que es el preferido por la mayoría de las compañías aéreas, debido a que conlleva un reducido consumo de combustible. El problema de este modo es que en él no se controla de forma activa el tiempo de llegada al punto de interés. En nuestro concepto operacional, la incertidumbre temporal se gestiona en mediante de la medición del tiempo en puntos estratégicamente escogidos a lo largo de la trayectoria de la aeronave, y permitiendo la modificación por el control de tierra de la velocidad de la aeronave. Aunque la base del concepto es la gestión de las ordenes de velocidad que se proporcionan al piloto, para ser capaces de operar con los niveles de equipamiento típicos actualmente, dicho concepto también constituye un marco en el que la aviónica más avanzada (por ejemplo, que permita el control por el FMS del tiempo de llegada) puede integrarse de forma natural, una vez que esta tecnología este instalada. Además de gestionar la incertidumbre temporal a través de la medición en múltiples puntos, se intenta reducir dicha incertidumbre al mínimo mediante la mejora de las herramienta de predicción de la trayectoria en tierra. En esta tesis se presenta una novedosa descomposición del proceso de predicción de trayectorias en dos etapas. Dicha descomposición permite integrar adecuadamente los datos de la trayectoria de referencia calculada por el Flight Management System (FMS), disponibles usando Futuro Sistema de Navegación Aérea (FANS), en el sistema de predicción de trayectorias en tierra. FANS es un equipo presente en los aviones comerciales de fuselaje ancho actualmente en la producción, e incluso algunos aviones de fuselaje estrecho pueden tener instalada avionica FANS. 
Además de informar automáticamente de la posición de la aeronave, FANS permite proporcionar (parte de) la trayectoria de referencia en poder de los FMS, pero la explotación de esta capacidad para la mejora de la predicción de trayectorias no se ha estudiado en profundidad en el pasado. La predicción en dos etapas proporciona una solución adecuada al problema de sincronización de trayectorias aire-tierra dado que permite la sincronización de las dimensiones controladas por el sistema de guiado utilizando la información de la trayectoria de referencia proporcionada mediante FANS, y también facilita la mejora en la predicción de las dimensiones abiertas restantes usado un modelo del guiado que explota los modelos meteorológicos mejorados disponibles en tierra. Este proceso de predicción de la trayectoria de dos etapas se aplicó a una muestra de 438 vuelos reales que realizaron un descenso continuo (sin intervención del controlador) con destino Melbourne. Dichos vuelos son de aeronaves del modelo Boeing 737-800, si bien la metodología descrita es extrapolable a otros tipos de aeronave. El método propuesto de predicción de trayectorias permite una mejora en la desviación estándar del error de la estimación del tiempo de llegada al punto de interés, que es un 30% menor que la que obtiene el FMS. Dicha trayectoria prevista mejorada se puede utilizar para establecer la secuencia de arribadas y para la asignación de las franjas horarias para cada aterrizaje (slots). Sobre la base del slot asignado, se determina un perfil de velocidades que permita cumplir con dicho slot con un impacto mínimo en la eficiencia del vuelo. En la tesis se propone un nuevo algoritmo que determina las velocidades requeridas sin necesidad de un proceso iterativo de búsqueda sobre el sistema de predicción de trayectorias. 
El algoritmo se basa en una parametrización inteligente del proceso de predicción de la trayectoria, que permite relacionar el tiempo estimado de llegada con una función polinómica. Resolviendo dicho polinomio para el tiempo de llegada deseado, se obtiene de forma natural el perfil de velocidades optimo para cumplir con dicho tiempo de llegada sin comprometer la eficiencia. El diseño de los sistemas de gestión de arribadas propuesto en esta tesis aprovecha la aviónica y los sistemas de comunicación instalados de un modo mucho más eficiente, proporcionando valor añadido para la industria. Por tanto, la solución es compatible con la transición hacia los sistemas de aviónica avanzados que están desarrollándose actualmente. Los beneficios que se obtengan a lo largo de dicha transición son un incentivo para inversiones subsiguientes en la aviónica y en los sistemas de control de tráfico en tierra. ABSTRACT Air traffic management (ATM) is undergoing a paradigm shift towards trajectory based operations where the role of an air traffic controller evolves from that of continuous intervention towards supervision, as decision making is improved based on increased confidence in the solutions provided by advanced automation. To support this concept, significant investment for the development and acquisition of new equipment is required on the ground as well as in the air, to facilitate the high degree of trajectory synchronisation and information exchange required. Over the past 30-40 years the airline industry has generated one of the lowest returns on invested capital among all industries. Without tangible benefits realised, the airline industry may find it difficult to attract the required investment capital and delay acquiring equipment needed to realise the concept of trajectory based operations. 
In response to these challenges facing the modernisation of ATM, this thesis aims to answer the question whether existing aircraft capabilities can be applied to achieve sufficient trajectory synchronisation and improvements to ground-based trajectory prediction in support of the arrival management process, to realise some of the benefits envisioned under trajectory based operations, and to provide an incentive for further avionics upgrades. The proposed operational concept aims to permit aircraft to operate in a manner consistent with current optimal aircraft operating techniques. It allows aircraft to descend in the fuel efficient path managed mode as preferred by a majority of airlines, with arrival time not actively controlled by the airborne automation. The temporal uncertainty is managed through metering at strategically chosen points along the aircraft’s trajectory with primary use of speed advisories. While the focus is on speed advisories to support all aircraft and different levels of equipage, the concept also constitutes a framework in which advanced avionics as airborne time-of-arrival control can be integrated once this technology is widely available. In addition to managing temporal uncertainty through metering at multiple points, this temporal uncertainty is minimised by improving the supporting trajectory prediction capability. A novel two-stage trajectory prediction process is presented to adequately integrate aircraft trajectory data available through Future Air Navigation Systems (FANS) into the ground-based trajectory predictor. FANS is standard equipment on any wide-body aircraft in production today, and some single-aisle aircraft are easily capable of being fitted with FANS. In addition to automatic position reporting, FANS provides the ability to provide (part of) the reference trajectory held by the aircraft’s Flight Management System (FMS), but this capability has yet been widely overlooked. 
The two-stage process provides a ‘best of both worlds’ solution to the air-ground synchronisation problem by synchronising with the FMS reference trajectory those dimensions controlled by the guidance mode, and improving the prediction of the remaining open dimensions by exploiting the high-resolution meteorological forecast available to a ground-based system. The two-stage trajectory prediction process was applied to a sample of 438 FANS-equipped Boeing 737-800 flights into Melbourne conducting a continuous descent free from ATC intervention, and can be extrapolated to other aircraft types. Trajectories predicted through the two-stage approach provided estimated times of arrival with a 30% reduction in the standard deviation of the error compared to the estimated times of arrival calculated by the FMS. This improved predicted trajectory can subsequently be used to set the sequence and allocate landing slots. Based on the allocated landing slot, the proposed system calculates a speed schedule for the aircraft to meet this landing slot at minimal flight-efficiency impact. A novel algorithm is presented that determines this speed schedule without requiring an iterative process in which multiple calls to a trajectory predictor need to be made. The algorithm is based on a parameterisation of the trajectory prediction process, allowing the estimated time of arrival to be represented by a polynomial function of the speed schedule, and thereby providing an analytical solution for the speed schedule required to meet a set arrival time. The arrival management solution proposed in this thesis leverages existing avionics and communications systems, creating new value for industry from current investment. The solution therefore supports a transition from mixed equipage towards the advanced avionics currently under development. Benefits realised during this transition may provide an incentive for ongoing investment in avionics.
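The parameterisation idea described above can be illustrated with a minimal sketch: sample a trajectory predictor at a few candidate speeds, fit a polynomial so the estimated time of arrival becomes an analytical function of speed, then solve that polynomial for the speed meeting a target arrival time. The predictor, segment length, and polynomial degree below are invented for illustration; the thesis's actual parameterisation of the full descent trajectory differs.

```python
import numpy as np

# Illustrative stand-in for a ground-based trajectory predictor: time (s)
# to fly a fixed 200 km segment at constant speed v (m/s). A real
# predictor would integrate the full aircraft trajectory.
def predict_eta(v):
    return 200_000.0 / v

# Parameterisation step: sample the predictor at a few candidate speeds
# and fit a polynomial, so the ETA becomes an analytical function of speed.
speeds = np.linspace(180.0, 250.0, 8)
coeffs = np.polyfit(speeds, [predict_eta(v) for v in speeds], deg=2)

# Scheduling step: solve the polynomial for the speed that meets a target
# arrival time, with no further calls to the trajectory predictor.
target_eta = 950.0  # seconds
shifted = coeffs.copy()
shifted[-1] -= target_eta          # p(v) = target_eta  =>  p(v) - target_eta = 0
roots = np.roots(shifted)
feasible = [r.real for r in roots
            if abs(r.imag) < 1e-9 and 180.0 <= r.real <= 250.0]
v_cmd = feasible[0]  # commanded speed, close to 200000/950 ≈ 210.5 m/s
```

The polynomial root replaces the iterative "guess a speed, rerun the predictor" loop: once the coefficients are fitted, any target arrival time maps to a speed in closed form.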

Relevância:

30.00%

Publicador:

Resumo:

A method is presented for computing the average solution of problems that are too complicated for adequate resolution, but where information about the statistics of the solution is available. The method involves computing average derivatives by interpolation based on linear regression, and an updating of a measure constrained by the available crude information. Examples are given.

Relevância:

30.00%

Publicador:

Resumo:

The polypeptide backbones and side chains of proteins are constantly moving due to thermal motion and the kinetic energy of the atoms. The B-factors of protein crystal structures reflect the fluctuation of atoms about their average positions and provide important information about protein dynamics. Computational approaches to predict thermal motion are useful for analyzing the dynamic properties of proteins with unknown structures. In this article, we utilize a novel support vector regression (SVR) approach to predict the B-factor distribution (B-factor profile) of a protein from its sequence. We explore schemes for encoding sequences and various settings for the parameters used in SVR. Based on a large dataset of high-resolution proteins, our method predicts the B-factor distribution with a Pearson correlation coefficient (CC) of 0.53. In addition, our method predicts the B-factor profile with a CC of at least 0.56 for more than half of the proteins. Our method also performs well for classifying residues (rigid vs. flexible). For almost all predicted B-factor thresholds, prediction accuracies (percent of correctly predicted residues) are greater than 70%. These results exceed the best results of other sequence-based prediction methods. (C) 2005 Wiley-Liss, Inc.
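The sequence-to-B-factor regression described above can be sketched with scikit-learn's SVR. The encoding here (a sliding window of hydrophobicity values), the toy sequence, the synthetic targets, and the SVR parameters are all invented placeholders; the paper explores its own encoding schemes and parameter settings.

```python
import numpy as np
from sklearn.svm import SVR

# Toy hydrophobicity scale restricted to the residues in the toy sequence.
HYDRO = {"A": 1.8, "G": -0.4, "L": 3.8, "S": -0.8, "K": -3.9, "E": -3.5}

def encode(seq, win=5):
    """Encode each residue as a window of hydrophobicity values,
    zero-padded at the termini (stand-in for the paper's encodings)."""
    pad = win // 2
    vals = [0.0] * pad + [HYDRO[a] for a in seq] + [0.0] * pad
    return np.array([vals[i:i + win] for i in range(len(seq))])

seq = "AGLSKEAGLSKEAGLSKE"
X = encode(seq)
# Synthetic "B-factors": flexible where the central residue is hydrophilic.
y = -X[:, 2] + 0.1 * np.random.default_rng(0).normal(size=len(seq))

model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
cc = np.corrcoef(y, model.predict(X))[0, 1]  # Pearson CC, the paper's metric
```

In-sample correlation on synthetic data like this is necessarily optimistic; the paper's CC of 0.53 is measured on a large dataset of real high-resolution structures.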

Relevância:

30.00%

Publicador:

Resumo:

Bio-systems are inherently complex information processing systems. Furthermore, the physiological complexities of biological systems limit the formation of hypotheses about behavior and the ability to test them. More importantly, the identification and classification of mutations in patients are central topics in today's cancer research. Next generation sequencing (NGS) technologies can provide genome-wide coverage at single-nucleotide resolution and at reasonable speed and cost. The unprecedented molecular characterization provided by NGS offers the potential for an individualized approach to treatment. These advances in cancer genomics have enabled scientists to interrogate cancer-specific genomic variants and compare them with the normal variants in the same patient. Analysis of these data provides a catalog of somatic variants, present in the tumor genome but not in the normal tissue DNA. In this dissertation, we present a new computational framework for the problem of predicting the number of mutations on a chromosome for a given patient, a fundamental problem in clinical and research settings. We begin this dissertation with the development of a framework system capable of utilizing published data from a longitudinal study of patients with acute myeloid leukemia (AML), whose DNA from both normal and malignant tissues was subjected to NGS analysis at various points in time. By processing the sequencing data obtained at the time of cancer diagnosis using the components of our framework, we tested it by predicting the genomic regions to be mutated at the time of relapse and, later, by comparing our results with the actual regions that showed mutations (discovered at relapse time). We demonstrate that this coupled algorithmic pipeline can drastically improve the ability to identify a reliable molecular signature. 
Arguably, the most important result of our research is its superior performance compared to other methods such as Radial Basis Function Networks, Sequential Minimal Optimization, and Gaussian Processes. In the final part of this dissertation, we present a detailed significance, stability, and statistical analysis of our model, together with a performance comparison of the results. This work lays a solid foundation for future research on other types of cancer.
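The baseline comparison mentioned above can be sketched as follows, fitting two of the named baselines (Sequential Minimal Optimization is the standard SVM training algorithm, represented here by scikit-learn's SVR, alongside a Gaussian Process regressor) on a synthetic regression problem. The features, targets, and hyperparameters are invented stand-ins; the dissertation evaluates on real AML sequencing data with held-out relapse mutations, not on a toy set.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor

# Synthetic stand-in for per-chromosome features at diagnosis (e.g.
# coverage, variant allele fractions) and mutation counts at relapse.
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=80)

# Fit each baseline and record its in-sample R^2 as a smoke test; a real
# comparison would use cross-validation or a held-out relapse cohort.
scores = {}
for name, model in [("SVR", SVR(C=100.0)), ("GP", GaussianProcessRegressor())]:
    model.fit(X, y)
    scores[name] = model.score(X, y)
```

Such a harness makes it straightforward to slot in additional baselines (e.g. an RBF network) and compare them under identical splits and metrics.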

Relevância:

30.00%

Publicador:

Resumo:

Medical imaging technology and applications are continuously evolving, dealing with images of increasing spatial and temporal resolution, which allow easier and more accurate medical diagnosis. However, this increase in resolution demands a growing amount of data to be stored and transmitted. Despite the high coding efficiency achieved by the most recent image and video coding standards in lossy compression, they are not well suited for quality-critical medical image compression, where either near-lossless or lossless coding is required. In this dissertation, two different approaches to improve the lossless coding of volumetric medical images, such as Magnetic Resonance and Computed Tomography, were studied and implemented using the latest High Efficiency Video Coding (HEVC) standard. In a first approach, the use of geometric transformations to perform inter-slice prediction was investigated. For the second approach, a pixel-wise prediction technique based on least-squares prediction, which exploits inter-slice redundancy, was proposed to extend the current HEVC lossless tools. Experimental results show a bitrate reduction between 45% and 49% when compared with DICOM-recommended encoders, and of 13.7% when compared with standard HEVC.
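The core of the second approach, least-squares pixel prediction exploiting inter-slice redundancy, can be sketched as below: predict each pixel of the current slice from the co-located pixel of the previous slice plus causal neighbours, with weights fitted by least squares. The neighbourhood, the global (rather than causal, per-pixel) weight fitting, and the synthetic slices are simplifications for illustration, not the dissertation's HEVC integration.

```python
import numpy as np

def ls_inter_slice(prev_slice, cur_slice):
    """Predict each pixel of cur_slice from the co-located pixel in
    prev_slice plus two causal neighbours, using least-squares weights.
    Illustrative only: a codec would train the weights causally so the
    decoder can reproduce them without side information."""
    H, W = cur_slice.shape
    rows, targets = [], []
    for i in range(1, H):
        for j in range(1, W):
            rows.append([prev_slice[i, j],     # co-located, previous slice
                         cur_slice[i, j - 1],  # left neighbour
                         cur_slice[i - 1, j],  # top neighbour
                         1.0])                 # bias term
            targets.append(cur_slice[i, j])
    A, t = np.array(rows), np.array(targets)
    w, *_ = np.linalg.lstsq(A, t, rcond=None)
    return t - A @ w  # residual: what the codec would entropy-code

rng = np.random.default_rng(0)
prev = rng.normal(size=(16, 16)).cumsum(axis=1)       # smooth-ish slice
cur = prev + rng.normal(scale=0.1, size=prev.shape)   # correlated next slice
residual = ls_inter_slice(prev, cur)
```

The point of the scheme is visible in the residual: when adjacent slices are strongly correlated, the prediction residual has far lower energy than the raw pixels, which is what makes the subsequent lossless entropy coding cheaper.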