950 results for density surface modelling


Relevance: 40.00%

Abstract:

Mixture Density Networks are a principled method for modelling conditional probability density functions which are non-Gaussian. This is achieved by modelling the conditional distribution for each pattern with a Gaussian Mixture Model whose parameters are generated by a neural network. This thesis presents a novel method for introducing regularisation in this context for the special case where the means and variances of the spherical Gaussian kernels in the mixtures are fixed to predetermined values. Guidelines for how these parameters can be initialised are given, and it is shown how to apply the evidence framework to mixture density networks to achieve regularisation. This also provides an objective stopping criterion that can replace the 'early stopping' methods used previously. If the neural network used is an RBF network with fixed centres, this opens up new opportunities for improved initialisation of the network weights, which are exploited to start training relatively close to the optimum. The new method is demonstrated on two data sets. The first is a simple synthetic data set, while the second is a real-life data set, namely satellite scatterometer data used to infer the wind speed and wind direction near the ocean surface. For both data sets the regularisation method performs well in comparison with earlier published results. Ideas on how the constraint on the kernels may be relaxed to allow fully adaptable kernels are presented.
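
As a minimal sketch of the model class described above (assuming NumPy; the function name and toy numbers are illustrative, not the thesis code), the following evaluates the conditional density of a mixture density network in the special case where the spherical Gaussian kernels have fixed, predetermined means and a fixed width, so the network only supplies the mixing coefficients.

```python
import numpy as np

def mdn_density(t, mixing_logits, centres, sigma):
    """Conditional density p(t|x) of an MDN with fixed spherical kernels.

    t             : (d,) target vector
    mixing_logits : (M,) network outputs for the mixing coefficients
    centres       : (M, d) fixed kernel means (predetermined, not learned)
    sigma         : scalar fixed kernel width shared by all kernels
    """
    d = t.shape[0]
    # Softmax turns the network outputs into mixing coefficients summing to 1.
    alpha = np.exp(mixing_logits - mixing_logits.max())
    alpha /= alpha.sum()
    # Spherical Gaussian kernel evaluated at t for every centre.
    sq_dist = np.sum((centres - t) ** 2, axis=1)
    norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    phi = np.exp(-sq_dist / (2.0 * sigma ** 2)) / norm
    return float(np.dot(alpha, phi))

# Toy usage: 3 fixed kernels on a 2-D target space.
centres = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
p = mdn_density(np.array([0.2, 0.1]),
                mixing_logits=np.array([0.5, -0.3, 0.1]),
                centres=centres, sigma=0.4)
print(p)
```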


Relevance: 40.00%

Abstract:

Issues of wear and tribology are increasingly important in computer hard drives as slider flying heights become lower and disk protective coatings thinner to minimise spacing loss and allow higher areal density. Friction, stiction and wear between the slider and disk in a hard drive were studied using Accelerated Friction Test (AFT) apparatus. Contact Start Stop (CSS) and constant-speed drag tests were performed using commercial rigid disks and two different air bearing slider types. Friction and stiction were captured during testing by a set of strain gauges. System parameters were varied to investigate their effect on tribology at the head/disk interface. The chosen parameters were disk spinning velocity, slider fly height, temperature, humidity and intercycle pause. The effect of different disk texturing methods was also studied. Models were proposed to explain the influence of these parameters on tribology. Atomic Force Microscopy (AFM) and Scanning Electron Microscopy (SEM) were used to study head and disk topography at various test stages and to provide physical parameters to verify the models. X-ray Photoelectron Spectroscopy (XPS) was employed to identify surface composition and determine whether any chemical changes had occurred as a result of testing. The parameters most likely to influence the interface were identified for both CSS and drag testing. Neural Network modelling was used to substantiate the results. Topographical AFM scans of the disk and slider were exported numerically to file and explored extensively. Techniques were developed which improved line and area analysis. A method for detecting surface contacts was also developed; its results supported and explained the observed AFT behaviour. Finally, surfaces were computer-generated to simulate real disk scans; this allowed contact analysis of many types of surface to be performed. Conclusions were drawn about which disk characteristics most affected contacts and hence friction, stiction and wear.
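
The contact-detection step lends itself to a short illustration. The sketch below (a simplified stand-in, not the thesis' method; the synthetic surface and fly heights are placeholders) counts the points of a height map that would reach a slider flying at a given height above the mean surface plane.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic rough surface standing in for an AFM height map (heights in nm).
heights = rng.normal(0.0, 1.2, size=(256, 256))

def contact_stats(height_map, fly_height):
    """Return the number of contacting points and the contacting area fraction
    for a slider flying at `fly_height` above the mean surface plane."""
    contacts = height_map > fly_height
    return int(contacts.sum()), float(contacts.mean())

for fh in (2.0, 3.0, 4.0):
    n, frac = contact_stats(heights, fh)
    print(f"fly height {fh:.1f} nm: {n} contacting points ({frac:.2%} of area)")
```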

Relevance: 40.00%

Abstract:

In nonlinear and stochastic control problems, learning an efficient feed-forward controller is not amenable to conventional neurocontrol methods. For these approaches, estimating and then incorporating uncertainty in the controller and feed-forward models can produce more robust control results. Here, we introduce a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems which can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty of the neural network outputs can be obtained using the statistical properties of the networks. More generally, multicomponent distributions can be modelled by the mixture density network. Based on importance sampling from these distributions, a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled approach to constraining the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. A nonlinear multi-variable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis. © 2004 Elsevier Ltd. All rights reserved.
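
The importance-sampling idea can be illustrated with a very reduced sketch (not the paper's algorithm; the mixture parameters and the toy plant below are invented for illustration): candidate control signals are drawn from the inverse model's mixture density, which localises the search, and each candidate is scored against the target with a forward model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_controls(alpha, mu, sigma, n):
    """Draw n candidate control signals from a 1-D Gaussian mixture
    (alpha: mixing coefficients, mu: component means, sigma: widths)."""
    comp = rng.choice(len(alpha), size=n, p=alpha)
    return rng.normal(mu[comp], sigma[comp])

def forward_model(u):
    """Toy stand-in for a learned forward plant model."""
    return np.tanh(2.0 * u) + 0.1 * u

# Inverse-model mixture for the current target (illustrative parameters).
alpha = np.array([0.6, 0.4])
mu = np.array([0.3, -1.2])
sigma = np.array([0.2, 0.3])

target = 0.5
candidates = sample_controls(alpha, mu, sigma, n=200)
# The sampled candidates constrain the search space; pick the control whose
# predicted plant output is closest to the target.
best = candidates[np.argmin((forward_model(candidates) - target) ** 2)]
print(best)
```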

Relevance: 40.00%

Abstract:

The cell:cell bond between an immune cell and an antigen-presenting cell is a necessary event in the activation of the adaptive immune response. At the juncture between the cells, cell surface molecules on the opposing cells form non-covalent bonds and a distinct patterning is observed that is termed the immunological synapse. An important binding molecule in the synapse is the T-cell receptor (TCR), which is responsible for antigen recognition through its binding with a major histocompatibility complex with bound peptide (pMHC). This bond leads to intracellular signalling events that culminate in the activation of the T-cell, and ultimately leads to the expression of the immune effector function. The temporal analysis of the TCR bonds during the formation of the immunological synapse presents a problem to biologists, due to the spatio-temporal scales (nanometers and picoseconds) that compare with experimental uncertainty limits. In this study, a linear stochastic model, derived from a nonlinear model of the synapse, is used to analyse the temporal dynamics of the bond attachments for the TCR. Mathematical analysis and numerical methods are employed to analyse the qualitative dynamics of the nonequilibrium membrane dynamics, with the specific aim of calculating the average persistence time for the TCR:pMHC bond. A single-threshold method, which has previously been used to successfully calculate the TCR:pMHC contact path sizes in the synapse, is applied to produce results for the average contact times of the TCR:pMHC bonds. This method is extended through the development of a two-threshold method, which produces results suggesting the average time persistence for the TCR:pMHC bond is of the order of 2-4 seconds, values that agree with experimental evidence for TCR signalling. The study reveals two distinct scaling regimes in the time-persistence survival probability density profile of these bonds, one dominated by thermal fluctuations and the other associated with TCR signalling. Analysis of the thermal fluctuation regime reveals a minimal contribution to the average time-persistence calculation, which has an important biological implication when comparing the probabilistic models to experimental evidence: in cases where only a few statistics can be gathered from experimental conditions, the results are unlikely to match the probabilistic predictions. The results also identify a rescaling relationship between the thermal noise and the bond length, suggesting that a recalibration of the experimental conditions to adhere to this scaling relationship will enable biologists to identify the start of the signalling regime for previously unobserved receptor:ligand bonds. Also, the regime associated with TCR signalling exhibits a universal decay rate for the persistence probability that is independent of the bond length.
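
To make the two-threshold idea concrete, here is a minimal sketch (not the thesis model; the Ornstein-Uhlenbeck-like trace and threshold values are illustrative only) of how dwell times can be measured with hysteresis: a bond is counted as formed when the trace drops below a lower threshold and only as broken when it later exceeds an upper threshold, which suppresses the rapid thermal re-crossings a single threshold would count.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated membrane separation (arbitrary units): a simple stochastic trace
# standing in for the linearised membrane model.
dt, n = 1e-3, 100_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = x[i-1] - 2.0 * x[i-1] * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()

def mean_persistence(trace, low, high, dt):
    """Two-threshold dwell-time estimate with hysteresis between low and high."""
    bound, start, dwells = False, 0, []
    for i, v in enumerate(trace):
        if not bound and v < low:
            bound, start = True, i
        elif bound and v > high:
            bound = False
            dwells.append((i - start) * dt)
    return np.mean(dwells) if dwells else float('nan')

print(mean_persistence(x, low=-0.2, high=0.2, dt=dt))
```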

Relevance: 40.00%

Abstract:

Quantitative information on metazoan meiofaunal abundance and biomass was obtained from three continental shelf stations (at 40, 100 and 200 m depth) and four deep-sea stations (at 540, 700, 940 and 1540 m depth) in the Cretan Sea (South Aegean Sea, NE Mediterranean). Samples were collected on a seasonal basis (from August 1994 to September 1995) with the use of a multiple corer. Meiofaunal abundance and biomass on the continental shelf of the Cretan Sea were high, in contrast to the extremely low values reported for the bathyal sediments, which were comparable to those reported for abyssal and hadal environments. In order to explain the spatial and seasonal changes in metazoan meiofauna, these data were compared with: (1) the concentrations of 'food indicators' (such as proteins, lipids, soluble carbohydrates and CPE); (2) the bacterial biomass; and (3) the flux of labile organic compounds to the sea floor at a fixed station (D7, 1540 m depth). Highly significant relationships between meiofaunal parameters and CPE, protein and lipid concentrations and bacterial biomass were found. Most of the indicators of food quality and quantity (such as CPE, proteins and carbohydrates) showed a clear seasonality, with highest values in February and lowest in September. Such changes were more evident on the continental shelf than at greater depths. On the continental shelf, significant seasonal changes in meiofaunal density were related to changes in the input of labile organic carbon, whereas meiofaunal assemblages at the deep-sea stations showed time-lagged changes in response to the food input recorded in February 1995. At all deep-sea stations meiofaunal density increased with a time lag of 2 months. Indications of a time-lagged meiofaunal response to the food inputs were also provided by the increase in nauplii densities during May 1995 and the increase in individual biomass of nematodes, copepods and polychaetes between February and May 1995. The lack of strong seasonal changes in deep-sea meiofaunal density suggests that the supply of organic matter below 500 m is not strong enough to support a significant meiofaunal development. Below 700 m depth, >92% of the total biomass in the sediment was represented by bacteria. The ratio of bacterial to meiofaunal biomass increased with increasing water depth, indicating that bacteria are probably more effective than meiofauna in exploiting refractory organic compounds. These data lead us to hypothesise that the deep-sea sediments of the Cretan Sea are largely dependent upon a benthic microbial loop.
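
The abundance-versus-food-indicator comparison reduces to a simple correlation, illustrated by the sketch below (the station values are placeholders, not the Cretan Sea measurements).

```python
import numpy as np

# Pearson correlation between station-mean meiofaunal density and sediment CPE.
meiofauna = np.array([820., 640., 450., 95., 60., 40., 25.])  # ind. / 10 cm2
cpe = np.array([12.5, 9.8, 7.1, 2.0, 1.4, 1.0, 0.6])          # ug / g sediment

r = np.corrcoef(meiofauna, cpe)[0, 1]
print(f"Pearson r between meiofaunal density and CPE: {r:.2f}")
```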

Relevance: 40.00%

Abstract:

In northern regions where observational data are sparse, lake ice models are ideal tools as they can provide valuable information on ice cover regimes. The Canadian Lake Ice Model was used to simulate ice cover for a lake near Churchill, Manitoba, Canada, throughout the 2008/2009 and 2009/2010 ice-covered seasons. To validate and improve the model results, in situ measurements of the ice cover through both seasons were obtained using an upward-looking sonar device, a Shallow Water Ice Profiler (SWIP), installed on the bottom of the lake. The SWIP identified the ice-on/off dates and collected ice thickness measurements. In addition, a digital camera was installed on shore to capture images of the ice cover through the seasons, and field measurements were obtained of snow depth on the ice, the thickness of snow ice (if present) and the total ice cover. Altering the amounts of snow cover on the ice surface to represent potential snow redistribution affected simulated freeze-up dates by a maximum of 22 days and break-up dates by a maximum of 12 days, highlighting the importance of accurately representing the snowpack for lake ice modelling. The late-season ice thickness tended to be underestimated by the simulations, with break-up occurring too early; however, the simulated evolution of the ice cover fell within the range spanned by the full-snow and no-snow scenarios, with the thickness depending on the amount of snow cover on the ice surface.
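
The insulating role of snow can be shown with a much simpler growth law than the Canadian Lake Ice Model; the sketch below is a one-dimensional conduction (Stefan-type) approximation with illustrative constants and forcing, intended only to show why a deeper snowpack slows simulated ice growth.

```python
import numpy as np

# Simplified ice-growth sketch: ice thickens by heat conduction through the
# ice and its snow cover, so deeper snow insulates the ice and slows growth.
rho_i, L_f = 917.0, 3.34e5        # ice density (kg/m3), latent heat (J/kg)
k_ice, k_snow = 2.2, 0.3          # thermal conductivities (W/m/K)
dt = 86400.0                      # one-day time step (s)

def grow_ice(air_temp_series, snow_depth):
    """Return ice thickness (m) after stepping through daily air temperatures (degC)."""
    h = 0.02                                  # thin initial ice cover (m)
    for t_air in air_temp_series:
        if t_air >= 0.0:
            continue                          # no growth on melt days in this sketch
        resistance = h / k_ice + snow_depth / k_snow   # series resistance of ice + snow
        flux = -t_air / resistance            # W/m2 conducted away from the water
        h += flux * dt / (rho_i * L_f)        # freeze enough water to carry that flux
    return h

temps = -10.0 * np.ones(120)                  # 120 days at -10 degC (illustrative)
print(grow_ice(temps, snow_depth=0.0))        # no snow: thicker ice
print(grow_ice(temps, snow_depth=0.3))        # 30 cm of snow: much thinner ice
```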

Relevance: 40.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 40.00%

Abstract:

Tris(2-ethylhexyl) trimellitate (TOTM) was recently suggested as a reference fluid of high viscosity at elevated temperature and pressure for industrial use. Viscosity and density data have already been published for one sample covering the temperature range (303-373) K and pressures up to about 65 MPa. The viscosity covered a range from about (9 to 460) mPa·s. In the present article we study several other characteristics of TOTM that must be available if it were to be adopted as a standard. First, we present values for the viscosity and density obtained with a different sample of TOTM to examine the important feature of consistency among different samples. Vibrating-wire viscosity measurements were performed at pressures from (5 to 100) MPa, along 6 isotherms between (303 and 373) K. Density measurements were carried out from (293 to 373) K up to 68 MPa, along 4 isotherms, using an Anton Paar DMA HP vibrating U-tube densimeter. Secondly, we report a study of the effect of water contamination on the viscosity of TOTM, performed using an Ubbelohde viscometer at atmospheric pressure. Finally, in order to support the use of TOTM as a reference liquid for the calibration of capillary viscometers, values of its surface tension, obtained by the pendant drop method, are provided. (C) 2016 Elsevier B.V. All rights reserved.
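
Compressed-liquid density measurements of this kind are commonly correlated with a Tait-type equation; the sketch below shows the functional form only, with placeholder coefficients that are not the published TOTM fit.

```python
import numpy as np

def tait_density(T, P, rho0_coeff=(990.0, -0.75), B_coeff=(300.0, -1.0),
                 C=0.088, P_ref=0.1):
    """Tait-type density correlation (kg/m3) for a compressed liquid.

    T in K, P and P_ref in MPa. All coefficients are illustrative placeholders.
    """
    a0, a1 = rho0_coeff
    rho0 = a0 + a1 * (T - 293.15)          # density at the reference pressure
    b0, b1 = B_coeff
    B = b0 + b1 * (T - 293.15)             # Tait B parameter (MPa)
    return rho0 / (1.0 - C * np.log((B + P) / (B + P_ref)))

# Illustrative use over the measured ranges (293-373 K, up to 68 MPa).
print(tait_density(303.15, 0.1))
print(tait_density(303.15, 68.0))
```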

Relevance: 40.00%

Abstract:

Satellites have great potential for the diagnosis of surface air quality conditions, though the reduced sensitivity of satellite instrumentation to the lower troposphere currently impedes their applicability. One objective of the NASA DISCOVER-AQ project is to provide information relevant to improving our ability to relate satellite-observed columns to surface conditions for key trace gases and aerosols. In support of DISCOVER-AQ, this dissertation investigates the degree of correlation between O3 and NO2 column abundance and surface mixing ratio during the four DISCOVER-AQ deployments; characterizes the variability of the aircraft in situ and model-simulated O3 and NO2 profiles; and uses the WRF-Chem model to further investigate the role of boundary layer mixing in the column-surface connection for the Maryland 2011 deployment and to determine which of the available boundary layer schemes best captures the observations. Simple linear regression analyses suggest that O3 partial column observations from future satellite instruments with sufficient sensitivity to the lower troposphere may be most meaningful for surface air quality under the conditions associated with the Maryland 2011 campaign, which included generally deep, convective boundary layers, the least wind shear of all four deployments, and few geographical influences on local meteorology, with the exception of bay breezes. Hierarchical clustering analysis of the in situ O3 and NO2 profiles indicates that the degree of vertical mixing (defined by temperature lapse rate) associated with each cluster exerted an important influence on the shapes of the median cluster profiles for O3 and also affected the column vs. surface correlations for many clusters for both O3 and NO2. However, comparisons to the CMAQ model suggest that, among other errors, vertical mixing is overestimated, causing too great a column-surface connection within the model. Finally, the WRF-Chem model, a meteorology model with coupled chemistry, is used to further investigate the impact of vertical mixing on the O3 and NO2 column-surface connection for an ozone pollution event that occurred on July 26-29, 2011. Five PBL schemes were tested, with no one scheme producing a clear, consistent "best" comparison with the observations for PBLH and pollutant profiles; however, despite improvements, the ACM2 scheme continues to overestimate vertical mixing.
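
The column-to-surface comparison is, at its simplest, an ordinary least-squares fit; the sketch below shows that step with placeholder arrays (not DISCOVER-AQ data).

```python
import numpy as np

# Regress surface mixing ratio against partial-column abundance and report the fit.
column = np.array([0.8, 1.1, 1.5, 1.9, 2.4, 2.7])    # e.g. O3 partial column (DU)
surface = np.array([38., 45., 52., 60., 71., 76.])   # surface mixing ratio (ppbv)

slope, intercept = np.polyfit(column, surface, 1)
r = np.corrcoef(column, surface)[0, 1]
print(f"surface ~ {slope:.1f} * column + {intercept:.1f},  r^2 = {r**2:.2f}")
```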

Relevance: 40.00%

Abstract:

Observing, modelling and understanding the climate-scale variability of deep water formation (DWF) in the North-Western Mediterranean Sea remains very challenging today. In this study, we first characterize the interannual variability of this phenomenon by a thorough reanalysis of observations in order to establish reference time series. These quantitative indicators include 31 observed years for the yearly maximum mixed layer depth over the period 1980–2013 and a detailed multi-indicator description of the period 2007–2013. Then a 1980–2013 hindcast simulation is performed with a fully-coupled regional climate system model including a high-resolution representation of the regional atmosphere, ocean, land surface and rivers. The simulation reproduces quantitatively well the mean behaviour and the large interannual variability of the DWF phenomenon. The model shows convection deeper than 1000 m in two thirds of the modelled winters and a mean DWF rate equal to 0.35 Sv, with maximum values of 1.7 (resp. 1.6) Sv in 2013 (resp. 2005). Using the model results, the winter-integrated buoyancy loss over the Gulf of Lions is identified as the primary driving factor of the DWF interannual variability and explains, alone, around 50 % of its variance. It is itself explained by the occurrence of a few stormy days during winter. At the daily scale, the Atlantic ridge weather regime is identified as favourable to strong buoyancy losses and therefore DWF, whereas the positive phase of the North Atlantic Oscillation is unfavourable. The driving role of the vertical stratification in autumn, a measure of the water column's inhibition to mixing, has also been analyzed. Combining both driving factors explains more than 70 % of the interannual variance of the phenomenon and in particular the occurrence of the five strongest convective years in the model (1981, 1999, 2005, 2009, 2013). The model simulates qualitatively well the trends in the deep waters (warming, saltening, increase in the dense water volume, increase in the bottom water density) despite an underestimation of the salinity and density trends. These deep trends arise from a heat and salt accumulation during the 1980s and the 1990s in the surface and intermediate layers of the Gulf of Lions, which is then transferred stepwise towards the deep layers when very convective years occur in 1999 and later. The salinity increase in the nearby Atlantic Ocean surface layers seems to be the external forcing that finally leads to these deep trends. In the future, our results may allow a better understanding of the behaviour of the DWF phenomenon in Mediterranean Sea simulations in hindcast, forecast, reanalysis or future climate change scenario modes. The robustness of the obtained results must, however, be confirmed in multi-model studies.
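
The variance-explained argument can be sketched as two nested least-squares fits: one with buoyancy loss alone and one adding autumn stratification, comparing the resulting R². The data below are synthetic placeholders generated for illustration, not the model output.

```python
import numpy as np

rng = np.random.default_rng(2)
n_years = 34
buoyancy_loss = rng.normal(0.0, 1.0, n_years)    # standardised winter-integrated buoyancy loss
stratification = rng.normal(0.0, 1.0, n_years)   # standardised autumn stratification
dwf = 0.7 * buoyancy_loss - 0.5 * stratification + rng.normal(0.0, 0.5, n_years)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit y ~ X (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

print(r_squared(buoyancy_loss[:, None], dwf))                            # single driver
print(r_squared(np.column_stack([buoyancy_loss, stratification]), dwf))  # both drivers
```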

Relevance: 40.00%

Abstract:

Doctorate in Forestry and Natural Resources Engineering - Instituto Superior de Agronomia - UL

Relevance: 40.00%

Abstract:

A new approach is described herein, in which neutron reflectivity measurements that probe changes in the density profile of thin films as they absorb material from the gas phase have been combined with a Love-wave-based gravimetric assay that measures the mass of absorbed material. This combination of techniques not only determines the spatial distribution of absorbed molecules, but also reveals the amount of void space within the thin film (a quantity that can be difficult to assess using neutron reflectivity measurements alone). The uptake of organic solvent vapours into spun-cast films of polystyrene has been used as a model system, with a view to this method having the potential for extension to the study of other systems. These could include, for example, humidity sensors, hydrogel swelling, biomolecule adsorption or transformations of electroactive and chemically reactive thin films. This is the first-ever demonstration of combined neutron reflectivity and Love-wave-based gravimetry, and the experimental caveats, limitations and scope of the method are explored and discussed in detail.
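
The void-space estimate follows from combining the two measurements: an areal mass from the gravimetric assay and a film thickness from the reflectivity fit give an apparent film density, which is compared with the fully dense material. The sketch below uses illustrative placeholder numbers, not the measured polystyrene values.

```python
# Estimate the void fraction of a thin film from combined measurements.
areal_mass = 1.00e-5       # g/cm2, from the Love-wave gravimetric assay
thickness = 1.05e-5        # cm (105 nm), from the neutron reflectivity fit
rho_bulk = 1.05            # g/cm3, fully dense polystyrene

rho_film = areal_mass / thickness          # apparent density of the film
void_fraction = 1.0 - rho_film / rho_bulk  # fraction of the film that is void space
print(f"film density {rho_film:.3f} g/cm3, void fraction {void_fraction:.2%}")
```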

Relevance: 30.00%

Abstract:

Response surface methodology based on a Box-Behnken design (BBD) was successfully applied to the optimization of the operating conditions of the electrochemical oxidation of sanitary landfill leachate, aimed at making this method feasible for scale-up. Landfill leachate was treated in a continuous batch-recirculation system, in which a dimensionally stable anode (DSA©), Ti coated with a TiO2 and RuO2 oxide film, was used. The effects of three variables, current density (milliamperes per square centimeter), time of treatment (minutes) and supporting electrolyte dosage (moles per liter), upon total organic carbon (TOC) removal were evaluated. Optimized conditions for the highest desirability were obtained at 244.11 mA/cm2, 41.78 min and 0.07 mol/L of NaCl, and at 242.84 mA/cm2, 37.07 min and 0.07 mol/L of Na2SO4. Under the optimal conditions, 54.99 % chemical oxygen demand (COD) and 71.07 % ammonia nitrogen (NH3-N) removal was achieved with NaCl, and 45.50 % COD and 62.13 % NH3-N removal with Na2SO4. A new predictive kinetic model, obtained from the relation between the BBD and the kinetic model, was suggested.
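
The response-surface step amounts to fitting a full quadratic model in the three coded factors to the Box-Behnken runs and locating its optimum; the sketch below does this with placeholder response values (not the leachate data) and a coarse grid search instead of a desirability solver.

```python
import numpy as np
from itertools import product

def quad_terms(x1, x2, x3):
    """Regressor vector of a full quadratic model in three coded factors."""
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1*x1, x2*x2, x3*x3]

# 15-run, 3-factor Box-Behnken design in coded units (-1, 0, +1), 3 centre points.
design = [(-1,-1,0),(1,-1,0),(-1,1,0),(1,1,0),(-1,0,-1),(1,0,-1),(-1,0,1),(1,0,1),
          (0,-1,-1),(0,1,-1),(0,-1,1),(0,1,1),(0,0,0),(0,0,0),(0,0,0)]
toc_removal = np.array([41., 48., 44., 55., 39., 50., 43., 54.,
                        42., 49., 45., 52., 58., 57., 59.])   # placeholder responses (%)

X = np.array([quad_terms(*run) for run in design], dtype=float)
beta, *_ = np.linalg.lstsq(X, toc_removal, rcond=None)

# Coarse grid search over the coded factor space for the predicted maximum.
grid = np.linspace(-1, 1, 21)
best = max(product(grid, repeat=3), key=lambda r: np.dot(quad_terms(*r), beta))
print("predicted optimum (coded factors):", best)
```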