919 results for "Validation method"
Abstract:
A new type of space debris was recently discovered by Schildknecht in near-geosynchronous orbit (GEO). These objects were later identified as exhibiting properties associated with High Area-to-Mass Ratio (HAMR) objects. According to their brightness magnitudes (light curves), high rotation rates and composition properties (albedo, amount of specular and diffuse reflection, colour, etc.), these objects are thought to be multilayer insulation (MLI). Observations have shown that this debris type is very sensitive to environmental disturbances, particularly solar radiation pressure, because their shapes are easily deformed, leading to changes in the area-to-mass ratio (AMR) over time. This thesis proposes a simple, effective flexible model of the thin, deformable membrane using two different methods. First, the debris is modelled with Finite Element Analysis (FEA) using Bernoulli-Euler beam theory, called the "Bernoulli model". The Bernoulli model is constructed with beam elements consisting of two nodes, each with six degrees of freedom (DoF); the mass of the membrane is distributed over the beam elements. Second, the debris is modelled on multibody dynamics theory, called the "Multibody model", as a series of lumped masses connected through flexible joints, representing the flexibility of the membrane itself. The mass of the membrane, albeit low, is taken into account with lumped masses at the joints. The dynamic equations for the masses, including the constraints defined by the connecting rigid rods, are derived using fundamental Newtonian mechanics. The physical properties required by both flexible models (membrane density, reflectivity, composition, etc.) are assumed to be those of multilayer insulation. Both flexible membrane models are then propagated together with classical orbital and attitude equations of motion near the GEO region to predict the orbital evolution under the perturbations of solar radiation pressure, the Earth's gravity field, luni-solar gravitational fields and the self-shadowing effect. These results are then compared to two rigid-body models (cannonball and flat rigid plate). In this investigation, when compared with a rigid model, the evolution of the orbital elements of the flexible models shows different inclination and secular eccentricity evolutions, rapid irregular attitude motion and an unstable cross-sectional area due to deformation over time. Monte Carlo simulations varying the initial attitude dynamics and deformation angle are then investigated and compared with the rigid models over 100 days. The simulations show that different initial conditions produce unique orbital motions that differ significantly from those of both rigid models. Furthermore, this thesis presents a methodology to determine the dynamic material properties of thin membranes and validates the deformation of the multibody model with real MLI materials. Experiments are performed in a high-vacuum chamber (10⁻⁴ mbar) replicating the space environment. A thin membrane is hinged at one end but free at the other. The first experiment, the free motion experiment, is a free vibration test to determine the damping coefficient and natural frequency of the thin membrane. In this test, the membrane is allowed to fall freely in the chamber, with the motion tracked and captured through high-speed video frames. A Kalman filter technique is implemented in the tracking algorithm to reduce noise and increase the tracking accuracy of the oscillating motion.
The forced motion experiment, the last test, is performed to determine the deformation characteristics of the object. A high-power spotlight (500-2000 W) is used to illuminate the MLI, and the displacements are measured by means of a high-resolution laser sensor. Finite Element Analysis (FEA) and multibody dynamics models of the experimental setups are used for the validation of the flexible model by comparison with the experimental displacements and natural frequencies.
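The Kalman filter tracking step lends itself to a short illustration. A minimal sketch, assuming a constant-velocity state model and hypothetical camera and noise parameters (the abstract does not specify the filter design used in the thesis):

```python
import numpy as np

def kalman_track(measurements, dt=1/240, meas_var=0.5**2, accel_var=50.0**2):
    """Smooth noisy 1-D displacement measurements of an oscillating
    membrane edge with a constant-velocity Kalman filter.
    dt, meas_var and accel_var are assumed values, not from the thesis."""
    F = np.array([[1.0, dt], [0.0, 1.0]])            # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])                       # we observe position only
    Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])   # process noise
    R = np.array([[meas_var]])                       # measurement noise
    x = np.array([[measurements[0]], [0.0]])         # initial state
    P = np.eye(2) * 1e3                              # vague initial covariance
    smoothed = []
    for z in measurements:
        x = F @ x                                    # predict
        P = F @ P @ F.T + Q
        y = np.array([[z]]) - H @ x                  # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
        x = x + K @ y                                # update
        P = (np.eye(2) - K @ H) @ P
        smoothed.append(x[0, 0])
    return np.array(smoothed)

# example: a noisy decaying oscillation, as in the free vibration test
t = np.arange(0.0, 2.0, 1 / 240)
truth = 5.0 * np.exp(-0.8 * t) * np.cos(2 * np.pi * 4.0 * t)
noisy = truth + np.random.default_rng(1).normal(0.0, 0.5, t.size)
estimate = kalman_track(noisy)
```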
Abstract:
Objective: To analyze pharmaceutical interventions carried out with the support of an automated system for validation of treatments vs. the traditional method without computer support. Method: The automated program, ALTOMEDICAMENTOS® version 0, contains 925 052 data entries with information on approximately 20 000 medicines, analyzing doses, administration routes, number of days of treatment, dosing in renal and liver failure, interaction control, similar drugs, and enteral medicines. During eight days, in four different hospitals (a high-complexity hospital with over 1 000 beds, a 400-bed intermediate hospital, a geriatric hospital and a monographic hospital), the same patients and treatments were analyzed using both systems. Results: 3 490 patients were analyzed, with 42 155 different treatments. 238 interventions were performed using the traditional system (0.56% of possible interventions) vs. 580 (1.38%) with the automated one. Very significant pharmaceutical interventions were 0.14% vs. 0.46%; significant, 0.38% vs. 0.90%; non-significant, 0.05% vs. 0.01%, respectively. If both systems are used simultaneously, interventions are performed in 1.85% of cases vs. 0.56% with just the traditional system. Using only the traditional model, 30.5% of the possible interventions are detected, whereas with the automated system alone, without manual review, 84% are detected. Conclusions: The automated system increases pharmaceutical interventions by a factor of 2.43 to 3.64. According to the results of this study, the traditional validation system should be revised to rely on automated systems. The automated program works correctly in different hospitals.
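A quick arithmetic check of the headline rates, using only the counts reported above:

```python
treatments = 42_155
trad, auto = 238, 580
print(f"traditional: {trad / treatments:.2%}")  # -> 0.56%
print(f"automated:   {auto / treatments:.2%}")  # -> 1.38%
print(f"overall ratio: {auto / trad:.2f}x")     # -> 2.44x, near the lower end of the reported 2.43-3.64 range
```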
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The hermit crab Clibanarius vittatus is a typical organism of intertidal regions and is considered a good bioindicator of tributyltin (TBT) presence in these environments. This study therefore presents the analytical performance and validation of a method for TBT quantification in tissues of C. vittatus by gas chromatography with pulsed flame photometric detection (GC-PFPD) after extraction with an apolar solvent (toluene) and Grignard derivatization. The limits of detection (LOD) of the method were 2.0 and 2.8 ng g⁻¹ for TBT and DBT (dibutyltin), respectively, and its limits of quantification (LOQ) were 6.6 and 8.9 ng g⁻¹, respectively. The method was applied to samples from the Santos Estuary, São Paulo State, Brazil. TBT and DBT concentrations ranged from 26.7 to 175.0 ng g⁻¹ and from 46.2 to 156.0 ng g⁻¹, respectively. These concentrations are worrisome, since toxic effects (such as endocrine disruption) have been reported for other organisms even at levels lower than those registered in this study.
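As an illustration of how such limits are often derived (the abstract does not state which convention the authors followed), a sketch using the common 3.3·σ/S and 10·σ/S rules with hypothetical calibration data:

```python
import numpy as np

def lod_loq(conc, signal):
    """Estimate LOD and LOQ from a calibration line with the common
    ICH-style rules LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma
    is the residual standard deviation and S the slope. Illustrative
    only; the paper's exact procedure is not given in the abstract."""
    conc = np.asarray(conc, float)
    signal = np.asarray(signal, float)
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    sigma = resid.std(ddof=2)           # two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# hypothetical calibration points (concentration in ng/g vs response)
lod, loq = lod_loq([5, 10, 25, 50, 100], [0.9, 2.1, 5.2, 10.3, 20.8])
print(f"LOD = {lod:.1f} ng/g, LOQ = {loq:.1f} ng/g")
```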
Abstract:
Client owners usually need an estimate or forecast of their likely building costs in advance of detailed design in order to confirm the financial feasibility of their projects. Because of their timing in the project life cycle, these early-stage forecasts are characterized by the minimal amount of information available concerning the new (target) project, to the point that often only its size and type are known. One approach is to use the mean contract sum of a sample, or base group, of previous projects of a similar type and size to the project for which the estimate is needed. Bernoulli's law of large numbers implies that this base group should be as large as possible; however, increasing the size of the base group inevitably involves including projects that are less and less similar to the target project. Deciding on the optimal number of base group projects is known as the homogeneity or pooling problem. A method of solving the homogeneity problem is described, involving the use of closed-form equations to compare three different sampling arrangements of previous projects for their simulated forecasting ability by a cross-validation method, in which a series of targets are extracted, with replacement, from the groups and compared with the mean value of the projects in the base groups. The procedure is then demonstrated with 450 Hong Kong projects (of different project types: residential, commercial centre, car parking, social community centre, school, office, hotel, industrial, university and hospital) clustered into base groups according to their type and size.
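A minimal sketch of this cross-validation idea, restricted to a single project type for simplicity and using synthetic data: each target project is withheld in turn, forecast by the mean contract sum of its k nearest neighbours by size, and k is chosen to minimise the error (a simplified leave-one-out variant of the resampling scheme described above):

```python
import numpy as np

def cv_error(sizes, costs, k):
    """RMS relative error when each project is withheld in turn and
    forecast by the mean cost of its k nearest neighbours by size."""
    errs = []
    for i in range(len(sizes)):
        others = np.delete(np.arange(len(sizes)), i)
        nearest = others[np.argsort(np.abs(sizes[others] - sizes[i]))[:k]]
        errs.append((costs[nearest].mean() - costs[i]) / costs[i])
    return np.sqrt(np.mean(np.square(errs)))

# hypothetical base group: floor areas (m2) and contract sums
rng = np.random.default_rng(0)
sizes = rng.uniform(1_000, 50_000, 60)
costs = 1_500 * sizes * rng.lognormal(0.0, 0.25, 60)   # cost roughly tracks size

best_k = min(range(2, 31), key=lambda k: cv_error(sizes, costs, k))
print("base-group size minimising CV error:", best_k)
```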
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets, up to several metres, without notice. Hence, ambiguity validation is essential to control ambiguity resolution quality. Currently, the most popular ambiguity validation is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and user requirements. Missed detection of incorrect integers leads to a hazardous result, which should be strictly controlled; in ambiguity resolution, the missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, the criteria table for the ratio test is computed based on extensive data simulations, and real-time users can determine the ratio test criterion by looking it up in the table. This method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed on the basis of extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
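A minimal sketch of the ratio-test acceptance step, assuming an ILS search (e.g. LAMBDA) has already returned the two best integer candidates; the threshold would come from the fixed-failure-rate criteria table described above, so a placeholder value is used here:

```python
import numpy as np

def ratio_test(a_float, Q_inv, a_best, a_second, threshold):
    """Fixed-failure-rate ratio test: fix to a_best only if the
    second-best integer candidate fits sufficiently worse.
    threshold comes from the model-dependent criteria table described
    in the paper; the value used below is a placeholder."""
    def sq_norm(a):                     # ||a_float - a||^2 in the metric Q_inv
        d = a_float - a
        return d @ Q_inv @ d
    ratio = sq_norm(a_second) / sq_norm(a_best)
    return ratio >= threshold           # True: fix ambiguities; False: keep float

# hypothetical 3-D example with a diagonal ambiguity covariance
a_float = np.array([3.2, -1.9, 7.4])
Q_inv = np.linalg.inv(np.diag([0.04, 0.05, 0.03]))
a_best = np.round(a_float)
a_second = a_best + np.array([1.0, 0.0, 0.0])
print(ratio_test(a_float, Q_inv, a_best, a_second, threshold=3.0))
```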
Abstract:
Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used for positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis; the fixed failure rate approach has a rigorous probability-theory basis but employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function, is proposed for the FF-difference test. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data; the validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
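The FF-difference test replaces the quotient with a difference of the same two squared norms used in the ratio-test sketch above; a sketch, with the paper's threshold function represented only as a placeholder argument:

```python
def difference_test(sq_norm_best, sq_norm_second, threshold):
    """FF-difference test: accept the fixed solution when the gap between
    the second-best and best candidates' squared norms exceeds the
    threshold. In the paper the threshold is given by a threshold
    function of the ILS success rate at the chosen failure rate
    tolerance; here it is just a placeholder argument."""
    return (sq_norm_second - sq_norm_best) >= threshold
```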
Abstract:
The inverse problem in diffuse optical tomography is known to be nonlinear, ill-posed, and sometimes under-determined, requiring regularization to obtain meaningful results, with Tikhonov-type regularization being the most popular. The choice of the regularization parameter dictates the reconstructed optical image quality and is typically made empirically or based on prior experience. An automated method for optimal selection of the regularization parameter, based on the regularized minimal residual method (MRM), is proposed and compared with the traditional generalized cross-validation method. The results obtained using numerical and gelatin phantom data indicate that the MRM-based method is capable of providing the optimal regularization parameter. © 2012 Society of Photo-Optical Instrumentation Engineers (SPIE). DOI: 10.1117/1.JBO.17.10.106015
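For reference, the traditional generalized cross-validation comparator mentioned above picks the Tikhonov parameter by minimising the GCV function; a minimal sketch for a linear problem Ax = b (the paper's MRM-based selection is not reproduced here):

```python
import numpy as np

def gcv_lambda(A, b, lambdas):
    """Standard GCV selection of the Tikhonov parameter for
    min ||Ax - b||^2 + lam^2 ||x||^2, computed via the SVD of A.
    Shown only as the traditional comparator named in the abstract;
    the MRM-based method is different."""
    U, s, _ = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    n = len(b)
    best = None
    for lam in lambdas:
        f = s**2 / (s**2 + lam**2)                      # filter factors
        resid = np.sum(((1 - f) * beta) ** 2) + (b @ b - beta @ beta)
        gcv = n * resid / (n - np.sum(f)) ** 2          # GCV(lam)
        if best is None or gcv < best[0]:
            best = (gcv, lam)
    return best[1]

# hypothetical linear test problem
rng = np.random.default_rng(2)
A = rng.normal(size=(50, 30))
b = A @ np.r_[np.ones(5), np.zeros(25)] + 0.01 * rng.normal(size=50)
lam = gcv_lambda(A, b, np.logspace(-4, 1, 60))
```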
Abstract:
Jakarta is vulnerable to flooding, mainly caused by prolonged and heavy rainfall, and robust hydrological modelling is therefore called for. Good-quality spatial precipitation data are desired so that a good hydrological model can be achieved. Two types of rainfall source are available: satellite and gauge station observations. At-site rainfall is considered a reliable and accurate source of rainfall; however, the limited number of stations makes spatial interpolation not very appealing. On the other hand, gridded rainfall nowadays has high spatial resolution and improved accuracy, but is still relatively less accurate than its gauge counterpart. To achieve a better precipitation data set, this study proposes the cokriging method, a blending algorithm, to yield a blended satellite-gauge gridded rainfall at approximately 10-km resolution. The Global Satellite Mapping of Precipitation (GSMaP, 0.1°×0.1°) and daily rainfall observations from gauge stations are used. The blended product is compared with the satellite data by a cross-validation method. The newly yielded blended product is then used to re-calibrate the hydrological model. Several scenarios are simulated by the hydrological models calibrated with gauge observations alone and with the blended product. The performance of the two calibrated hydrological models is then assessed and compared based on simulated and observed runoff.
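A sketch of the gauge-based validation step described above, assuming the blended (or satellite) grid has already been produced; the cokriging itself, for which a geostatistics library would normally be used, is not reproduced:

```python
import numpy as np

def validate_grid(grid, lats, lons, gauges):
    """Compare a gridded rainfall product against withheld gauge
    observations by nearest-cell sampling; returns (bias, RMSE).
    grid       : 2-D daily rainfall field (~0.1 deg, as for GSMaP)
    lats, lons : 1-D coordinate axes of the grid
    gauges     : iterable of (lat, lon, observed_mm) for withheld stations
    """
    errs = []
    for glat, glon, obs in gauges:
        i = int(np.abs(lats - glat).argmin())   # nearest grid row
        j = int(np.abs(lons - glon).argmin())   # nearest grid column
        errs.append(grid[i, j] - obs)
    errs = np.asarray(errs)
    return errs.mean(), np.sqrt(np.mean(errs**2))
```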
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Flucloxacillin sodium (FLU) is a semi-synthetic penicillin active against many gram-positive bacteria such as streptococci and penicillinase-producing staphylococci, including methicillin-susceptible S. aureus. This study describes the development and validation of a microbiological assay, applying the agar diffusion method, for the determination of FLU, as well as the evaluation of the ability of the method to determine the stability of FLU in capsules against acidic and basic hydrolysis, photolytic and oxidative degradation, using S. aureus ATCC 25923 as the test micro-organism and a 3 x 3 parallel-line assay design (three doses of the standard and three doses of the sample on each plate), with six plates per assay, according to the Brazilian Pharmacopoeia. The method validation showed good results, including linearity, precision, accuracy, robustness and selectivity. The assay is based on the inhibitory effect of FLU on Staphylococcus aureus ATCC 25923. The results of the assay were treated by analysis of variance (ANOVA) and were found to be linear (r = 0.9997) in the range from 1.5 to 6.0 μg/mL, precise (repeatability: R.S.D. = 1.63; intermediate precision: R.S.D. = 1.64) and accurate (98.96%). FLU solution (from the capsules) exposed to direct UVC light (254 nm), alkaline and acid hydrolysis, and oxidation with hydrogen peroxide was used to evaluate the specificity of the bioassay. Comparison of the bioassay and liquid chromatography by ANOVA showed no difference between the methodologies. The results demonstrate the validity of the proposed bioassay, which is a simple and useful alternative methodology for FLU determination in routine quality control.
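The 3 x 3 parallel-line design admits a compact illustration: the relative potency is read from the horizontal shift between the standard and sample log-dose/response lines under a common slope. A sketch with hypothetical inhibition-zone data (the paper's actual responses are not given):

```python
import numpy as np

def parallel_line_potency(log_doses, resp_std, resp_sample):
    """Relative potency from a parallel-line assay: fit a common slope of
    response (inhibition-zone diameter) vs log-dose, then read the
    horizontal shift between the sample and standard lines."""
    x = np.asarray(log_doses, float)
    ys = np.asarray(resp_std, float)
    yt = np.asarray(resp_sample, float)
    # common slope = average of the two per-preparation slopes
    b = (np.cov(x, ys)[0, 1] + np.cov(x, yt)[0, 1]) / (2 * np.var(x, ddof=1))
    M = (yt.mean() - ys.mean()) / b     # log10 relative potency
    return 10 ** M

# hypothetical zone diameters (mm) at the 1.5, 3.0 and 6.0 ug/mL doses
doses = np.log10([1.5, 3.0, 6.0])
print(parallel_line_potency(doses, [15.1, 17.9, 20.8], [15.0, 17.8, 20.9]))
```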