939 results for process parameter monitoring


Relevância:

30.00%

Publicador:

Resumo:

Camera traps are used to estimate densities or abundances using capture-recapture and, more recently, random encounter models (REMs). We deploy REMs to describe an invasive-native species replacement process, and to demonstrate their wider application beyond abundance estimation. The Irish hare Lepus timidus hibernicus is a high-priority endemic of conservation concern. It is threatened by an expanding population of non-native European hares L. europaeus, an invasive species of global importance. Camera traps were deployed in thirteen 1 km squares, wherein the ratio of invader to native densities was corroborated by night-driven line transect distance sampling throughout the study area of 1652 km². Spatial patterns of invasive and native densities between the invader’s core and peripheral ranges, and native allopatry, were comparable between methods. Native densities in the peripheral range were comparable to those in native allopatry using REM, or marginally depressed using distance sampling. Numbers of the invader were substantially higher than those of the native in the core range, irrespective of method, with a 5:1 invader-to-native ratio indicating species replacement. We also describe a post hoc optimization protocol for REM which will inform subsequent (re-)surveys, allowing survey effort (camera hours) to be reduced by up to 57% without compromising the width of confidence intervals associated with density estimates. This approach will form the basis of a more cost-effective means of surveillance and monitoring for both the endemic and invasive species. The European hare undoubtedly represents a significant threat to the endemic Irish hare.
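
The abstract does not reproduce the REM estimator itself. For orientation, a minimal sketch of the standard random encounter model formula (Rowcliffe et al. 2008), D = (y/t)·π / (v·r·(2 + θ)), is given below; the parameter names and numeric values are illustrative assumptions, not the study's survey parameters.

```python
import math

def rem_density(detections, camera_hours, day_range_km_per_day,
                detection_radius_km, detection_angle_rad):
    """Random encounter model (Rowcliffe et al. 2008):
    D = (y/t) * pi / (v * r * (2 + theta)),
    with the trapping rate y/t expressed per day."""
    trapping_rate_per_day = detections / (camera_hours / 24.0)
    return (trapping_rate_per_day * math.pi /
            (day_range_km_per_day * detection_radius_km *
             (2.0 + detection_angle_rad)))

# Illustrative values only (not taken from the study).
print(rem_density(detections=40, camera_hours=2000,
                  day_range_km_per_day=1.2,
                  detection_radius_km=0.01,
                  detection_angle_rad=math.radians(40)))
```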

Relevância:

30.00%

Publicador:

Resumo:

We present a robust Dirichlet process for estimating survival functions from samples with right-censored data. It adopts a prior near-ignorance approach to avoid almost any assumption about the distribution of the population lifetimes, as well as the need to elicit an infinite-dimensional parameter (in the case of a lack of prior information), as happens with the usual Dirichlet process prior. We show how such a model can be used to derive robust inferences from right-censored lifetime data. Robustness is due to the identification of the decisions that are prior-dependent, and can be interpreted as an analysis of sensitivity with respect to the hypothetical inclusion of fictitious new samples in the data. In particular, we derive a nonparametric estimator of the survival probability and a hypothesis test about the probability that the lifetime of an individual from one population is shorter than the lifetime of an individual from another. We evaluate these ideas on simulated data and on the Australian AIDS survival dataset. The methods are publicly available through an easy-to-use R package.
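
As background only, the classical nonparametric baseline for right-censored survival data is the Kaplan–Meier product-limit estimator; the sketch below implements that baseline, not the paper's robust Dirichlet process estimator or its R package, and the toy data are invented.

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit estimator S(t) = prod_{t_i <= t} (1 - d_i / n_i)
    for right-censored data; events[i] = 1 for an observed death,
    0 for a censored observation."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    surv, curve = 1.0, []
    for t in np.unique(times):
        at_this_time = times == t
        deaths = int(events[at_this_time].sum())
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
        curve.append((t, surv))
        n_at_risk -= int(at_this_time.sum())
    return curve

# Toy data: five lifetimes, event = 0 marks right-censoring.
print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
```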

Relevância:

30.00%

Publicador:

Resumo:

The UK’s transportation network is supported by critical geotechnical assets (cuttings, embankments and dams) that require sustainable, cost-effective management while maintaining an appropriate service level to meet social, economic, and environmental needs. Recent effects of extreme weather on these geotechnical assets have highlighted their vulnerability to climate variations. We have assessed the potential of surface wave data to portray the climate-related variations in the mechanical properties of a clay-filled railway embankment. Seismic data were acquired bimonthly from July 2013 to November 2014 along the crest of a heritage railway embankment in southwest England. For each acquisition, the collected data were first processed to obtain a set of Rayleigh-wave dispersion and attenuation curves, referenced to the same spatial locations. These data were then analyzed to identify a coherent trend in their spatial and temporal variability. The relevance of the observed temporal variations was also verified with respect to the experimental data uncertainties. Finally, the surface wave dispersion data sets were inverted to reconstruct a time-lapse model of S-wave velocity for the embankment structure, using a least-squares laterally constrained inversion scheme. A key point of the inversion process was the estimation of a suitable initial model and the selection of adequate levels of spatial regularization. The initial model and the strength of spatial smoothing were then kept constant throughout the processing of all available data sets to ensure homogeneity of the procedure and comparability among the obtained VS sections. A continuous and coherent temporal pattern in the surface wave data, and consequently in the reconstructed VS models, was identified. This pattern is related to the seasonal distribution of precipitation and soil water content measured on site.
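
As a rough illustration of a laterally constrained least-squares inversion step, the sketch below solves a Tikhonov-style damped system with a first-difference operator coupling laterally adjacent model cells; the operator G, the data d and the regularization strength lam are placeholders, not the study's dispersion data or exact inversion scheme.

```python
import numpy as np

# Minimal sketch: minimize ||G m - d||^2 + lam * ||L m||^2, where L
# penalises differences between laterally adjacent model cells.
rng = np.random.default_rng(0)
n_cells, n_data = 20, 30
G = rng.normal(size=(n_data, n_cells))   # linearised forward operator (toy)
d = rng.normal(size=n_data)              # observed dispersion data (toy)
lam = 1.0                                # strength of lateral smoothing

# First-difference operator coupling neighbouring model cells.
L = np.eye(n_cells - 1, n_cells, k=1) - np.eye(n_cells - 1, n_cells)

# Regularized normal equations: (G^T G + lam L^T L) m = G^T d
m = np.linalg.solve(G.T @ G + lam * (L.T @ L), G.T @ d)
print(m.shape)  # (20,) regularized model estimate
```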

Relevância:

30.00%

Publicador:

Resumo:

Time-domain modelling of single-reed woodwind instruments usually involves a lumped model of the excitation mechanism. The parameters of this lumped model have to be estimated for use in numerical simulations. Several attempts have been made to estimate these parameters, including observations of the mechanics of isolated reeds, measurements under artificial or real playing conditions and estimations based on numerical simulations. In this study an optimisation routine is presented that can estimate reed-model parameters given the pressure and flow signals in the mouthpiece. The method is validated by testing it on a series of numerically synthesised data. In order to incorporate the actions of the player in the parameter estimation process, the optimisation routine has to be applied to signals obtained under real playing conditions. The estimated parameters can then be used to resynthesise the pressure and flow signals in the mouthpiece. In the case of measured data, as opposed to numerically synthesised data, special care needs to be taken when modelling the bore of the instrument. In fact, a careful study of various experimental datasets revealed that, for resynthesis to work, the bore termination impedance must be known very precisely from theory. An example is given in which the above requirement is satisfied, and the resynthesised signals closely match the original signals generated by the player.
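
The abstract does not spell out the lumped reed model used here. As an illustration of the kind of optimisation involved, the sketch below fits the parameters of a common quasi-static single-reed flow characteristic to synthetic pressure and flow data with scipy's least_squares; all parameter names and values are assumptions for the example, not the study's model or data.

```python
import numpy as np
from scipy.optimize import least_squares

def reed_flow(delta_p, k, h0, w, rho=1.2):
    """Quasi-static single-reed flow: the reed channel closes as
    h = max(h0 - delta_p / k, 0), giving u = w * h * sqrt(2 * delta_p / rho)
    for non-negative pressure difference delta_p."""
    h = np.maximum(h0 - delta_p / k, 0.0)
    return w * h * np.sqrt(2.0 * np.clip(delta_p, 0.0, None) / rho)

def fit_reed_params(delta_p, u_measured, x0=(1e8, 4e-4, 1.2e-2)):
    """Estimate stiffness k, rest opening h0 and effective width w by
    minimising the residual between modelled and 'measured' flow."""
    residual = lambda x: reed_flow(delta_p, *x) - u_measured
    return least_squares(residual, x0=x0).x

# Toy data generated from the model itself plus noise.
dp = np.linspace(0.0, 3000.0, 200)                       # Pa
u_true = reed_flow(dp, k=8e7, h0=5e-4, w=1.0e-2)         # m^3/s
u_noisy = u_true + 1e-6 * np.random.default_rng(1).normal(size=dp.size)
print(fit_reed_params(dp, u_noisy))
```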

Relevância:

30.00%

Publicador:

Resumo:

The stamping industry has shown a growing interest in numerical simulations of sheet metal forming processes, including inverse engineering methods. This is mainly because the trial-and-error techniques widely used in the past are no longer economically competitive. The use of simulation codes is nowadays common practice in industry, since the results typically obtained with codes based on the Finite Element Method (FEM) are well accepted by the industrial and scientific communities. To obtain accurate stress and strain fields, an efficient FEM analysis requires correct input data, such as geometries, meshes, non-linear constitutive laws, loadings, friction laws, etc. Inverse problems can be considered in order to overcome these difficulties. In the present work, the following inverse problems in computational mechanics are presented and analysed: (i) parameter identification problems, which concern the determination of input parameters to be subsequently used in constitutive models in numerical simulations, and (ii) initial geometric definition problems for blanks and tools, in which the goal is to determine the initial shape of a blank or tool so that a given geometry is obtained after the forming process. New optimisation strategies are introduced and implemented, leading to more accurate constitutive model parameters. The aim of these strategies is to take advantage of the strengths of each algorithm and to improve the overall efficiency of classical optimisation methods, which are based on single-stage processes. Deterministic algorithms, algorithms inspired by evolutionary processes, or combinations of the two are used in the proposed strategies. Cascade, parallel and hybrid strategies are presented in detail, the hybrid strategies consisting of combinations of cascade and parallel strategies. Two distinct methods for evaluating the objective function in parameter identification processes are presented and analysed: a single-point analysis and a finite element analysis. The single-point evaluation characterises an infinitesimal amount of material subjected to a given strain history. In the finite element analysis, on the other hand, the constitutive model is implemented and evaluated at every integration point. Inverse problems such as the geometric definition of blanks and tools are also presented and described. In the case of blank shape optimisation, the definition of the initial shape of a blank for forming an oil-pan (cárter) element is taken as the case study. Within this scope, a study of the influence of the initial geometric definition of the blank on the optimisation process is also carried out. This study uses a NURBS formulation to define the upper surface of the metal blank, whose geometry changes during the plastic forming process. For tool optimisation, a two-stage forging process is presented. With the aim of obtaining a perfect cylinder after forging, two distinct methods are considered. In the first, the initial shape of the cylinder is optimised; in the second, the shape of the first-stage forming tool is optimised. Different methods are used to parametrise the free surface of the cylinder, and different parametrisations are also used to define the tool. The optimisation strategies proposed in this work efficiently solve optimisation problems for the metal forming industry.
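
As an illustration of the cascade strategy described above (a global, evolutionary stage followed by a deterministic local refinement), the sketch below chains scipy's differential evolution and L-BFGS-B on a toy parameter identification problem; the Hollomon-type hardening law, the synthetic data and the bounds are assumptions, not the thesis' actual objective function.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

def objective(params):
    """Placeholder least-squares gap between modelled and 'experimental'
    stress values; stands in for a single-point or FEM-based evaluation."""
    k, n = params
    strain = np.linspace(0.01, 0.30, 30)
    stress_model = k * strain ** n        # Hollomon-type hardening law
    stress_exp = 500.0 * strain ** 0.20   # synthetic "experimental" data
    return float(np.sum((stress_model - stress_exp) ** 2))

bounds = [(100.0, 1000.0), (0.05, 0.5)]

# Stage 1: evolutionary global search.
stage1 = differential_evolution(objective, bounds, seed=0, maxiter=50)

# Stage 2: deterministic local refinement started from the stage-1 result.
stage2 = minimize(objective, stage1.x, method="L-BFGS-B", bounds=bounds)
print(stage1.x, stage2.x)
```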

Relevância:

30.00%

Publicador:

Resumo:

Monitoring of coastal and estuarine water quality has traditionally been performed by sampling with subsequent laboratory analysis. This has the disadvantages of low spatial and temporal resolution and high cost. In the last decades two alternative techniques have emerged to overcome this drawback: profiling and remote sensing. Profiling using multi-parameter sensors is now at a commercial stage. It can be used, attached to a boat, to obtain a quick “picture” of the system. The spatial resolution thus increases from single points to a line coincident with the boat track. The temporal resolution, however, remains unchanged, since the campaigns and resources involved are basically the same. The need for laboratory analysis is reduced but not eliminated, because parameters like nutrients, microbiology or metals are still difficult to obtain with sensors and validation measurements are still needed. In recent years the improvement in satellite resolution has enabled its use for coastal and estuarine water monitoring. Although the spatial coverage and resolution of satellite images are already suitable for coastal and estuarine monitoring, temporal resolution is naturally limited by satellite passes and cloud cover. Given this panorama, the best approach to water monitoring is to integrate and combine data from all these sources. The natural tools to perform this integration are numerical models. Models benefit from the different sources of data to obtain a better calibration. After calibration they can be used to extend the methods' resolution in space and time. In the Algarve (southern Portugal) a monitoring effort using this approach is being undertaken. The monitoring effort comprises five different locations, including coastal waters, estuaries and coastal lagoons. The objective is to establish the baseline situation to evaluate the impact of Waste Water Treatment Plant design and retrofitting. The field campaigns include monthly synoptic profiling, using a YSI 6600 multi-parameter system, laboratory analysis and fixed stations. The remote sensing uses ENVISAT/MERIS Level 2 Full Resolution data. These data are combined with the MOHID modelling system to obtain an integrated description of the systems. The results show the limitations of each method and the ability of the modelling system to integrate the results and to produce a comprehensive picture of the system.

Relevância:

30.00%

Publicador:

Resumo:

Doctoral thesis, Chemistry, Universidade do Algarve, 2005

Relevância:

30.00%

Publicador:

Resumo:

Doctoral thesis (joint supervision), Psychology (Educational Psychology), Faculdade de Psicologia da Universidade de Lisboa, Faculdade de Psicologia e de Ciências da Educação da Universidade de Coimbra, Technical University of Darmstadt, 2014

Relevância:

30.00%

Publicador:

Resumo:

This paper presents a low-complexity, high-efficiency decimation filter which can be employed in ElectroCardioGram (ECG) acquisition systems. The decimation filter, with a decimation ratio of 128, works along with a third-order sigma-delta modulator. It is designed in four stages to reduce cost and power consumption. The work reported here provides an efficient approach to the decimation process for high-resolution biomedical data conversion applications by employing low-complexity, two-path all-pass based decimation filters. The performance of the proposed decimation chain was validated using the MIT-BIH arrhythmia database, and comparative simulations were conducted against the state of the art.
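
The abstract does not give the per-stage decimation factors. Purely as an illustration of multistage decimation by 128, the sketch below cascades four generic FIR decimation stages (an assumed 4 × 4 × 4 × 2 split) with scipy; it does not reproduce the paper's two-path all-pass filter structure.

```python
import numpy as np
from scipy import signal

def multistage_decimate(x, factors=(4, 4, 4, 2)):
    """Decimate in cascaded stages (product of factors = 128). Each stage
    low-pass filters and downsamples, which is cheaper than one large
    filter running at the full oversampled rate."""
    for q in factors:
        x = signal.decimate(x, q, ftype="fir", zero_phase=True)
    return x

# Toy oversampled input at 128x an assumed 500 Hz ECG output rate.
fs_in = 128 * 500
t = np.arange(fs_in) / fs_in             # one second of signal
x = np.sign(np.sin(2 * np.pi * 10 * t))  # crude stand-in for a modulator output
y = multistage_decimate(x)
print(len(x), "->", len(y))              # 64000 -> 500
```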

Relevância:

30.00%

Publicador:

Resumo:

Soil vapor extraction (SVE) and bioremediation (BR) are two of the most common soil remediation technologies. Their application is widespread; however, both present limitations, namely related to the efficiency of SVE on organic soils and to the remediation times of some BR processes. This work aimed to study the combination of these two technologies in order to verify whether the legal clean-up goals can be achieved in soil remediation projects involving seven different simulated soils, separately contaminated with toluene and xylene. The remediations consisted of the application of SVE followed by biostimulation. The results show that the combination of these two technologies is effective and achieves the clean-up goals imposed by Spanish legislation. Under the experimental conditions used in this work, SVE is sufficient for the remediation of soils, contaminated separately with toluene and xylene, with organic matter contents (OMC) below 4%. In soils with higher OMC, the use of BR as a complementary technology, once the concentration of contaminant in the gas phase of the soil reaches values near 1 mg/L, allows the clean-up goals to be achieved. The OMC was a key parameter because it hindered SVE through adsorption phenomena but enhanced the BR process by acting as a source of microorganisms and nutrients.

Relevância:

30.00%

Publicador:

Resumo:

Final project submitted for the Master's degree in Chemical and Biological Engineering, Chemical Processes branch

Relevância:

30.00%

Publicador:

Resumo:

Waste oil recycling companies play a very important role in our society. Competition among companies is tough, and process optimization is essential for survival. By equipping oil containers with a level monitoring system that periodically reports the level and raises an alert when it reaches a preset threshold, oil recycling companies are able to streamline the oil collection process and thus reduce operating costs while maintaining the quality of service. This paper describes the development of this level monitoring system by a team of four students from different engineering backgrounds and nationalities. The team conducted a study of the state of the art, drew up marketing and sustainable development plans and, finally, designed and implemented a prototype that continuously measures the container content level and sends an alert message as soon as it reaches the preset capacity.
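
As a rough illustration of the reporting logic described above (periodic level reports plus an alert once a preset threshold is crossed), here is a minimal sketch; the sensor-reading and messaging functions, the threshold and the reporting period are hypothetical placeholders, not the students' implementation.

```python
import random
import time

FILL_ALERT_THRESHOLD = 0.80   # alert once the container is 80% full (assumed value)
REPORT_PERIOD_S = 5           # shortened reporting period for demonstration

def read_level_fraction():
    """Placeholder for the real level sensor; returns fill level in [0, 1]."""
    return random.uniform(0.0, 1.0)

def send_report(level, alert):
    """Placeholder for the real reporting channel (e.g. SMS or HTTP)."""
    tag = "ALERT" if alert else "report"
    print(f"[{tag}] container at {level:.0%}")

def monitor(cycles=5):
    for _ in range(cycles):
        level = read_level_fraction()
        send_report(level, alert=level >= FILL_ALERT_THRESHOLD)
        time.sleep(REPORT_PERIOD_S)

if __name__ == "__main__":
    monitor()
```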

Relevância:

30.00%

Publicador:

Resumo:

Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties as well as to generate new hypotheses for future experimentation. A good model and analysis of soil property variations, permitting us to draw sound conclusions and to estimate spatially correlated variables at unsampled locations, clearly depend on the amount and quality of the data and on the robustness of the techniques and estimators. On the other hand, the quality of the data obviously depends on a competent data collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key points such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope and solar exposure. Obtaining good quality data from forest soils is predictably expensive, as it is labor intensive and demands considerable manpower and equipment both in field work and in laboratory analysis. Also, the sample collection scheme to be used in forest fieldwork is not simple to design, as the sampling strategies chosen depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the planned sampling depth, if no soil is found at all, or if large trees prevent collection. Considering this, designing a proficient soil sampling campaign in the forest is not always a simple process and sometimes represents a truly huge challenge. In this work, we present some difficulties that occurred during two experiments on forest soil conducted in order to study the spatial variation of some soil physical-chemical properties. Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different sampling tools were also used: a manual auger and a shovel. Both scenarios were analyzed, and the results show that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but a pre-defined grid often fails when the variability of the soil property is not spatially uniform. In this case, the sampling grid should be adapted from one part of the landscape to another, and this fact should be taken into account in any mathematical procedure.

Relevância:

30.00%

Publicador:

Resumo:

Background: Oncological treatments are traditionally administered via intravenous injection by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this study, 4 common oral chemotherapeutics are given to 50 patients (enrolment is still ongoing), divided into 4 groups. The aim is to evaluate adherence and offer these patients interdisciplinary support with the joint help of doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®) in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36−77 years) monitored for an average duration of 100 days (range: 5−210 days). Persistence was 98% and quality of execution 95%. Five patients underwent cyclic treatment (2 out of 3 weeks) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2−3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. Patients were satisfied with the interviews undergone during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily), according to the results of the questionnaire completed at the end of the monitoring period. Conclusion: Persistence and quality of execution observed in our capecitabine group were excellent and better than expected compared with previously published studies. The interdisciplinary approach allowed us to better identify and help patients with toxicities to maintain adherence. Overall, patients were satisfied with the global interdisciplinary follow-up. With longer follow-up, a better evaluation of our method and its impact will be possible. Interpretation of the results of patients in the other groups of this ongoing trial will provide information for a more detailed analysis.

Relevância:

30.00%

Publicador:

Resumo:

BACKGROUND: Infantile haemangiomas (IHs) are very common vascular tumours. Propranolol is at present the first-line treatment for problematic and complicated haemangiomas. In accordance with a Swiss protocol, children are monitored for 2 days at the start of the treatment to detect possible side effects of this drug. Our study advocates a simplification of the pretreatment monitoring process. METHODS: All children with a problematic and complicated haemangioma treated with propranolol between September 2009 and September 2012 were included in the study. All patients were hospitalised under constant nurse supervision for 48 hours at the start of the treatment and subjected to cardiac and blood measurements. The dosage of propranolol was 1 mg/kg/day on the first day and 2 mg/kg/day from the second day. Demographic data, clinical features, treatment outcome and complications were analysed. RESULTS: Twenty-nine infants were included in our study. Of these, 86.2% responded immediately to the treatment. There were no severe adverse reactions. Six patients presented transient side effects such as bradycardia and hypotension after the first dose, and hypoglycaemia later. No side effects occurred after the second dose. Treatment was never interrupted. CONCLUSION: Propranolol (a β-blocker) is a safe treatment for problematic IH. Side effects may occur after the first dose. Strict 48-hour monitoring in hospital is expensive and may be unnecessary as long as the contraindications for the drug are respected.