954 results for Computer software - Quality control


Relevance: 100.00%

Publisher:

Abstract:

Water quality management programs are an essential and unavoidable means of preserving water resources and using them sustainably. One of the important issues in determining the quality of water in rivers is the design of effective quality control networks, so that the quality variables measured at these stations are, as far as possible, indicative of overall changes in water quality. One way to achieve this goal is to increase the number of quality monitoring stations and sampling occasions. Since this dramatically increases the annual cost of monitoring, deciding which stations and parameters are the most important, and how sampling frequency should change so as to capture maximum change in the system under study, can inform future decisions on optimising the efficacy of the existing monitoring network: removing or adding stations or parameters and decreasing or increasing sampling occasions. To this end, the efficiency of multivariate statistical procedures was studied in this thesis. Multivariate statistical procedures, given their features, can serve as practical and useful methods for recognising and analysing river pollution, and consequently for understanding, reasoning about, controlling, and making correct decisions in water quality management. This research applied multivariate statistical techniques to analyse the quality of water, and to monitor the variables affecting it, in the Gharasou River in Ardabil Province, northwestern Iran. Over one year, 28 physical and chemical parameters were sampled at 11 stations. The results of these measurements were analysed by multivariate procedures such as Cluster Analysis (CA), Principal Component Analysis (PCA), Factor Analysis (FA), and Discriminant Analysis (DA).
Based on the findings from cluster analysis, principal component analysis, and factor analysis, the stations were divided into three groups: highly polluted (HP), moderately polluted (MP), and less polluted (LP). This study thus illustrates the usefulness of multivariate statistical techniques for the analysis and interpretation of complex data sets, and, in water quality assessment, for the identification of pollution sources/factors and the understanding of spatial variations in water quality for effective river water quality management. It also shows the effectiveness of these techniques for obtaining better information about water quality and for designing monitoring networks for the effective management of water resources. Based on these results, a water quality monitoring program for the Gharasou River was developed and presented.
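As an illustration of the clustering step, a minimal single-linkage agglomerative clustering (one common form of CA) can group stations by their standardized quality profiles. The station values below are invented for illustration and are not the thesis data:

```python
from math import dist

# Hypothetical standardized water-quality profiles for six stations
# (e.g. z-scores of three parameters); values are illustrative only.
stations = {
    "S1": [2.1, 1.8, 2.4],
    "S2": [1.9, 2.0, 2.2],
    "S3": [0.4, 0.6, 0.3],
    "S4": [0.5, 0.2, 0.4],
    "S5": [-1.2, -0.9, -1.1],
    "S6": [-1.0, -1.3, -0.8],
}

def single_linkage(points, k):
    """Agglomerative clustering: repeatedly merge the two closest
    clusters (single linkage) until k clusters remain."""
    clusters = [{name} for name in points]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] |= clusters[j]
        del clusters[j]
    return clusters

groups = single_linkage(stations, 3)
print(sorted(sorted(g) for g in groups))
```

Grouping real monitoring data this way is what supports decisions such as merging redundant stations or reducing sampling at stations whose profiles track each other.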

Relevance: 100.00%

Publisher:

Abstract:

This thesis reports on an investigation of the feasibility and usefulness of incorporating dynamic management facilities for managing sensed context data in a distributed context-aware mobile application. The investigation focuses on reducing the work required to integrate new sensed context streams into an existing context-aware architecture. Current architectures require integration work for each new stream and each new context encountered. This mode of operation is acceptable for today's fixed architectures. However, as systems become more mobile, the number of discoverable streams increases. Without the ability to discover and use these new streams, the functionality of any given device is limited to the streams it knows how to decode. The integration of new streams requires that the sensed context data be understood by the current application. If a new source provides data of a type that an application currently requires, then the new source should be connected to the application without any prior knowledge of that source. If the type is similar and can be converted, then this stream too should be appropriated by the application. Such applications are based on portable devices (phones, PDAs) running semi-autonomous services that use data from sensors connected to the devices, plus data exchanged with other such devices and remote servers. They must handle input from a variety of sensors, refining the data locally and managing its communication from the device under volatile and unpredictable network conditions. The choice to focus on locally connected sensory input allows for the introduction of privacy and access controls, since this local control can determine how the information is communicated to others. This investigation focuses on the evaluation of three approaches to sensor data management. The first system is characterised by its static management based on prepended metadata; this was the reference system.
Developed for a mobile system, it processed data based on the attached metadata, and the code that performed the processing was static. The second system was developed to move away from static processing and introduce greater freedom in handling the data stream, which resulted in a heavyweight approach. This approach focused on pushing the processing of the data into a number of networked nodes rather than the monolithic design of the previous system. By creating a separate communication channel for the metadata, it is possible to be more flexible about the amount and type of data transmitted. The final system pulled the benefits of the other systems together: by providing a small management class that loads a separate handler based on the incoming data, dynamism was maximised while maintaining ease of code understanding. The three systems were then compared to highlight their ability to dynamically manage new sensed context. The evaluation took two approaches. The first was a quantitative analysis of the code to understand the relative complexity of the three systems, carried out by evaluating what changes each system required to support a new context. The second took a qualitative view of the work required of the software engineer to reconfigure the systems to support a new data stream. The evaluation highlights the scenarios to which each of the three systems is best suited. There is always a trade-off in the development of a system, and the three approaches highlight this fact. A statically bound system can be quick to develop but may need to be completely rewritten if the requirements move too far. Alternatively, a highly dynamic system may be able to cope with new requirements, but the developer time needed to create such a system may exceed that of creating several simpler systems.
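The third system's small management class, which loads a separate handler based on the incoming data, might be sketched as follows; the class name and stream types are hypothetical, not taken from the thesis code:

```python
# Minimal sketch of dynamic handler dispatch for sensed-context
# streams. Handler registration and stream types are hypothetical.

class StreamManager:
    def __init__(self):
        self._handlers = {}

    def register(self, data_type, handler):
        """Associate a handler with a sensed-context data type."""
        self._handlers[data_type] = handler

    def dispatch(self, message):
        """Route a message to the handler matching its metadata type;
        unknown types are reported rather than crashing the pipeline."""
        handler = self._handlers.get(message["type"])
        if handler is None:
            return f"no handler for {message['type']}"
        return handler(message["payload"])

manager = StreamManager()
manager.register("temperature", lambda c: f"{c:.1f} °C")
manager.register("location", lambda xy: f"lat={xy[0]}, lon={xy[1]}")

print(manager.dispatch({"type": "temperature", "payload": 21.5}))  # 21.5 °C
print(manager.dispatch({"type": "heart_rate", "payload": 70}))     # unknown type
```

Registering a new handler is then the only work needed to integrate a newly discovered stream, which is the kind of dynamism the evaluation measures.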

Relevance: 100.00%

Publisher:

Abstract:

The Internet has grown in size at rapid rates since BGP records began, and continues to do so. This has raised concerns about the scalability of the current BGP routing system, as the routing state at each router in a shortest-path routing protocol grows at a supra-linear rate as the network grows. The concerns are that the memory capacity of routers will not be able to keep up with demand, and that the growth of the Internet will become ever more constrained as more and more of the world seeks the benefits of being connected. Compact routing schemes, in which the routing state grows only sub-linearly relative to the growth of the network, could solve this problem and ensure that router memory is not a bottleneck to Internet growth. These schemes trade away shortest-path routing for scalable memory state, by allowing some paths a certain amount of bounded "stretch". The most promising such scheme is Cowen Routing, which can provide scalable, compact routing state for Internet routing while still providing shortest-path routing to nearly all other nodes, with only slightly stretched paths to a very small subset of the network. Currently, there is no fully distributed form of Cowen Routing that would be practical for the Internet. This dissertation describes a fully distributed and compact protocol for Cowen Routing, using the k-core graph decomposition. Previous compact routing work showed that the k-core graph decomposition is useful for Cowen Routing on the Internet, but no distributed form existed. This dissertation gives a distributed k-core algorithm optimised to be efficient on dynamic graphs, along with proofs of its correctness. The performance and efficiency of this distributed k-core algorithm are evaluated on large Internet AS graphs, with excellent results. The dissertation then goes on to describe a fully distributed and compact Cowen Routing protocol.
The protocol comprises: a landmark selection process for Cowen Routing using the k-core algorithm, with mechanisms to ensure compact state at all times, including at bootstrap; a local cluster routing process, with mechanisms for policy application and control of cluster sizes, again ensuring that state remains compact at all times; and a landmark routing process with a prioritisation mechanism for announcements that ensures compact state at all times.
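For illustration, the k-core decomposition that underpins the landmark selection can be computed centrally by repeatedly deleting vertices of degree less than k. This sketch shows only what the distributed algorithm computes, not the distributed algorithm itself:

```python
# Illustrative (centralised) k-core computation: repeatedly remove
# vertices of degree < k until none remain below the threshold.

def k_core(adjacency, k):
    """Return the vertices of the k-core of an undirected graph given
    as {vertex: set(neighbours)}."""
    adj = {v: set(ns) for v, ns in adjacency.items()}  # work on a copy
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if len(adj[v]) < k:
                for u in adj[v]:
                    adj[u].discard(v)   # detach v from its neighbours
                del adj[v]
                changed = True
    return set(adj)

# Toy graph: a triangle (A, B, C) with a pendant vertex D.
graph = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C"},
}
print(k_core(graph, 2))  # the triangle survives the 2-core; D does not
```

In a compact routing context, vertices of high coreness (here, the triangle) are natural landmark candidates because they sit in the densely connected part of the graph.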

Relevance: 100.00%

Publisher:

Abstract:

One of the challenges posed to biomedical engineers by researchers in neuroscience is brain-machine interaction. The nervous system communicates via electrochemical signals, and implantable circuits make decisions in order to interact with the biological environment. It is well known that Parkinson's disease is related to a deficit of dopamine (DA). Different methods have been employed to control dopamine concentration, such as magnetic or electrical stimulators or drugs. In this work, the neurotransmitter concentration was controlled automatically, something not currently done. To that end, four systems were designed and developed: deep brain stimulation (DBS), transcranial magnetic stimulation (TMS), Infusion Pump Control (IPC) for drug delivery, and fast-scan cyclic voltammetry (FSCV), i.e. sensing circuits that detect the varying concentrations of neurotransmitters such as dopamine caused by these stimulations. Software was also developed for data display and analysis in synchrony with current events in the experiments. The flexibility of the system is such that DBS or TMS can be used alone or combined with infusion pumps and other stimulation techniques such as lights, sounds, etc. The developed system automatically controls the concentration of DA. Its resolution is around 0.4 µmol/L, with a concentration-correction time adjustable between 1 and 90 seconds. The system can control DA concentrations between 1 and 10 µmol/L, with an error of about ±0.8 µmol/L. Although designed to control DA concentration, the system can be used to control the concentration of other substances. It is proposed to continue the closed-loop development with FSCV and DBS (or TMS, or infusion) using parkinsonian animal models.
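The closed-loop idea (sense the DA concentration, then actuate stimulation or infusion to correct it) can be sketched as a simple proportional controller acting on a hypothetical first-order clearance model. The dynamics, gain, and clearance rate below are invented, not the thesis's hardware loop:

```python
# Hedged sketch of closed-loop concentration control: a proportional
# controller drives a hypothetical first-order dopamine model toward
# a setpoint. All constants are illustrative assumptions.

def simulate(setpoint, steps=200, dt=0.5):
    conc = 0.0           # current concentration (µmol/L)
    clearance = 0.05     # fraction cleared per unit time (assumed)
    gain = 0.2           # proportional controller gain (assumed)
    for _ in range(steps):
        error = setpoint - conc
        infusion = max(0.0, gain * error)   # actuator cannot remove DA
        conc += dt * (infusion - clearance * conc)
    return conc

final = simulate(5.0)
print(f"steady state ≈ {final:.2f} µmol/L")
```

Note the steady-state offset (about 4.0 µmol/L against a 5.0 µmol/L setpoint): pure proportional control leaves a residual error, which is why practical loops of this kind usually add an integral term.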

Relevance: 100.00%

Publisher:

Abstract:

Objectives: To list the essential variables for balanced scorecards covering all basic areas of work in a hospital radiopharmacy unit, whose proper management can be key to optimising the available resources; and, secondly, to enumerate the benefits for daily working practice after their integration. Methods: Review of the published literature on balanced scorecards, selecting the variables that allow the radiopharmacist to take an active role in improving their area of work. Programs built in Microsoft Access are used for the integrated management. Several modules manage all the information, from the prescription and scheduling of patients (who are assigned a code) through to the imaging examination. Variables also recorded include: the deadline date and time for ordering the radiopharmaceutical from the supplier; the date of the medical examination; management of generator elutions and cold kits; staff work shifts; a register of typified incidents and of data on the reception, labelling, quality control and dispensing of each radiopharmaceutical (ensuring traceability); detection of deviations between calibrated and measured activity; the activity dispensed and that available in real time; management of radioactive waste disposal, stock and expiry dates; dates of forthcoming equipment reviews; archiving of standard operating procedures; unit-conversion systems; and a register of clinical reports. Results: The specialised programs manage the information handled in the radiopharmacy unit, facilitating cost-effective decision-making.
The parameters analysed are: the number of preparations made and the activity handled; possible incidents in any of the routine processes; the percentage of satisfactory resolutions that do not result in lack of availability; correct traceability of the radiopharmaceuticals; the percentage of satisfactory quality controls; trends in consumption by type of radiopharmaceutical; etc. Improved order management ensures the presence of the radiopharmaceutical needed for each examination. Conclusions: These new balanced scorecards are useful for optimising orders and radiopharmaceuticals, ensuring traceability, managing inventory, clinical reports and radioactive waste, and evaluating the efficiency of the radiopharmacy unit, allowing the integration of these data with other healthcare management software. This methodology can be applied in primary care health centres to focus staff on their care and operational functions.
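The detection of deviations between calibrated and measured activity rests on standard radioactive decay correction; a minimal sketch follows (the half-life shown is that of Tc-99m, and the tolerance is an assumed QC limit, not one taken from the abstract):

```python
# Decay-correct a radiopharmaceutical's calibrated activity to the
# measurement time and flag deviations beyond a tolerance.
# Half-life shown is Tc-99m (6.01 h); the 10 % tolerance is assumed.

def expected_activity(a0_mbq, elapsed_h, half_life_h=6.01):
    """Activity remaining after elapsed_h hours: A = A0 * 2^(-t/T1/2)."""
    return a0_mbq * 2 ** (-elapsed_h / half_life_h)

def deviation_ok(measured_mbq, a0_mbq, elapsed_h, tol=0.10):
    """True if measured activity is within ±tol of the decay-corrected value."""
    expected = expected_activity(a0_mbq, elapsed_h)
    return abs(measured_mbq - expected) / expected <= tol

# One half-life after calibration, 740 MBq should read about 370 MBq.
print(round(expected_activity(740, 6.01), 1))   # 370.0
print(deviation_ok(365, 740, 6.01))             # within tolerance
print(deviation_ok(300, 740, 6.01))             # flagged as a deviation
```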

Relevance: 100.00%

Publisher:

Abstract:

Background: The aim of this study was to assess the quality of rapid HIV testing in South Africa. Method: A two-stage sampling procedure was used to select HCT sites in eight provinces of South Africa. The study employed both semi-structured interviews with HIV testers and observation of testing sessions as a means of data collection. In total, 63 HCT sites (one HIV tester per site) were included in the survey, which assessed qualifications, training, testing practices and attitudes towards rapid tests. Quantitative data were analysed using descriptive statistics and qualitative data were content-analysed. Results: Of the 63 HIV testers, 20.6% had a nursing qualification, 14.3% were professional counsellors, 58.7% were lay HIV counsellors and testers, and 6.4% were from other professions. Most HIV testers (87.3%) had received formal training in testing, lasting between 10 and 14 days, while 6 (9.5%) had none. The findings revealed sub-standard testing practices, mainly related to non-adherence to testing algorithms, poor external quality control practices, and poor handling and communication of discordant results. Conclusion: The quality of rapid HIV testing may be highly compromised by poor adherence to guidelines, as observed in our study.

Relevance: 100.00%

Publisher:

Abstract:

Thesis (Doctorate in Nuclear Technology)

Relevance: 100.00%

Publisher:

Abstract:

In quantitative risk analysis, the problem of estimating small threshold exceedance probabilities and extreme quantiles arises ubiquitously, in bio-surveillance, economics, natural disaster insurance actuarial work, quality control schemes, and elsewhere. A useful way to assess extreme events is to estimate the probabilities of exceeding large threshold values and the extreme quantiles judged relevant by interested authorities. Such information regarding extremes serves as essential guidance to those authorities in decision-making processes. However, in this context data are usually skewed in nature, and the rarity of exceedances of large thresholds implies large fluctuations in the distribution's upper tail, precisely where accuracy is most desired. Extreme Value Theory (EVT) is a branch of statistics that characterises the behaviour of the upper or lower tails of probability distributions. However, existing EVT methods for the estimation of small threshold exceedance probabilities and extreme quantiles often lead to poor predictive performance when the underlying sample is not large enough or does not contain values in the distribution's tail. In this dissertation, we are concerned with an out-of-sample semiparametric (SP) method for the estimation of small threshold exceedance probabilities and extreme quantiles. The proposed SP method for interval estimation calls for the fusion, or integration, of a given data sample with external computer-generated independent samples. Since more data are used, real as well as artificial, under certain conditions the method produces relatively short yet reliable confidence intervals for small exceedance probabilities and extreme quantiles.
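The EVT machinery referred to here can be illustrated with a peaks-over-threshold estimate: fit a generalized Pareto distribution (GPD) to the excesses over a high threshold by the method of moments, then extrapolate a small exceedance probability. The data below are synthetic, and this sketch is not the dissertation's SP fusion method:

```python
import math
import random
from statistics import mean, variance

# Peaks-over-threshold sketch: method-of-moments GPD fit to threshold
# excesses, then a tail estimate P(X > x) = P(X > u) * GPD survival.

def gpd_tail_probability(sample, threshold, x):
    excesses = [v - threshold for v in sample if v > threshold]
    p_u = len(excesses) / len(sample)       # empirical P(X > threshold)
    m, s2 = mean(excesses), variance(excesses)
    shape = 0.5 * (1 - m * m / s2)          # method-of-moments estimators
    scale = 0.5 * m * (m * m / s2 + 1)
    z = 1 + shape * (x - threshold) / scale # GPD survival argument
    return p_u * z ** (-1 / shape)

random.seed(0)
data = [random.expovariate(1.0) for _ in range(10_000)]  # synthetic Exp(1)
est = gpd_tail_probability(data, threshold=3.0, x=8.0)
true = math.exp(-8.0)                                     # exact for Exp(1)
print(f"estimated P(X>8) = {est:.5f}, true = {true:.5f}")
```

With an exponential sample the fitted shape is near zero, so the estimate should land close to the exact tail probability even though x = 8 lies far beyond most of the data, which is the point of the extrapolation.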

Relevance: 100.00%

Publisher:

Abstract:

Objectives: This study was an in-vitro evaluation of different brands of paracetamol and cotrimoxazole tablets used or found in Malawi, based on Pharmacopoeia standards, in order to ascertain the existence and extent of substandard medicines in Malawi and to give an overview of their distribution in the public and private sectors. Methodology: A cross-sectional analytical study was conducted using 11 samples each of paracetamol and cotrimoxazole tablets, collected by stratified random sampling. Samples were analysed using HPLC and spectrophotometric methods as outlined in the BP-2007 and USP-32, at the National Drug Quality Control Laboratory (NDQCL) in Lilongwe (under the Pharmacy, Medicines and Poisons Board, PMPB) and at Orient Pharma Co. Ltd of Taiwan. The results were analysed using Epi Info. Results and discussion: Fifty percent of samples (n=22) were not registered in the country by the PMPB as required by the PMP Act, with the majority of those coming from public health facilities. All paracetamol and cotrimoxazole samples complied with identification tests using the spectrophotometric and HPLC methods. Overall, 27.3% of samples failed to meet the BP-2007 standards for active ingredient content, while 22.7% of the samples failed the friability test. The results from Malawi are similar in magnitude to those from surrounding countries in Africa. Conclusion: This pilot study provides objective evidence that substandard and unregistered paracetamol and cotrimoxazole are present and being used in Malawi, posing a considerable hazard to public health. The PMPB, together with the Ministry of Health, must continue to develop a quality assurance system to ensure that medicines are randomly and routinely checked.

Relevance: 100.00%

Publisher:

Abstract:

Regardless of the methodology adopted in software development, both the managerial or project-direction activities and the technical activities inherent to the development of the product itself are involved, such as the elicited requirements, analysis, design, implementation, and testing or trials prior to release. The present work stems from the interest in designing a methodology for managing the testing phase, based on the integration model of the activities contemplated in the PMBOK guide, which is compatible with the management functions and technical activities of other methodologies, especially in their testing stage. Hence the importance for project managers of obtaining satisfactory results in this phase, given its direct and significant impact on meeting the estimated time and cost: it makes it possible to prevent or mitigate additional time or cost overruns due to rework, avoiding their being transferred to the client or absorbed by the software manufacturer. Likewise, ensuring correct execution of the testing phase guarantees that the project meets quality standards, in accordance with its measurement indicators and user satisfaction.

Relevance: 100.00%

Publisher:

Abstract:

The increased prevalence of iron deficiency among infants can be attributed, among other factors, to the consumption of an iron-deficient diet, or of a diet that interferes with iron absorption, at the critical time of infancy. The gradual shift from breast milk to other foods and liquids is a transition period which contributes greatly to iron deficiency anaemia (IDA). The purpose of this research was to assess iron deficiency anaemia among infants aged six to nine months in Keiyo South Sub County. The specific objectives of this study were to establish the prevalence of iron deficiency anaemia and the dietary iron intake among infants aged 6 to 9 months. A cross-sectional study design was adopted. The study was conducted in three health facilities in Keiyo South Sub County. The infants were selected using a two-stage cluster sampling procedure, and systematic random sampling was then used to select a total of 244 mothers and their infants: eighty-two (82) infants from Kamwosor sub-district hospital and eighty-one (81) from each of the Nyaru and Chepkorio health facilities. Interview schedules, 24-hour dietary recalls and food frequency questionnaires were used to collect dietary iron intake data. Biochemical tests were carried out at the health facilities using the Hemo-control photometer. Infants whose hemoglobin levels were less than 11 g/dl were considered anaemic. Further, peripheral blood smears were conducted to ascertain the type of nutritional anaemia. Data were analysed using the Statistical Package for the Social Sciences (SPSS) version 17 (2009), and dietary iron intake was analysed using the NutriSurvey 2007 software. Results indicated that the mean hemoglobin value was 11.3 ± 0.84 g/dl. 21.7% of the infants had anaemia, and 100% of the peripheral blood smears indicated iron deficiency anaemia.
Dietary iron intake was a predictor of iron deficiency anaemia in this study (t=-3.138; p=0.01). Iron deficiency anaemia was evident among infants in Keiyo South Sub County. The Ministry of Health should formulate and implement policies on screening for anaemia, and ensure intensive nutrition education on iron-rich diets during child welfare clinics.

Relevance: 100.00%

Publisher:

Abstract:

Purpose: To develop a high-performance liquid chromatography (HPLC) fingerprint method for the quality control and origin discrimination of Gastrodiae rhizoma. Methods: Twelve batches of G. rhizoma collected from Sichuan, Guizhou and Shanxi provinces in China were used to establish the fingerprint. The chromatographic peak of gastrodin was taken as the reference peak, and all separations were performed on an Agilent C18 column (250 mm × 4.6 mm, 5 μm) at a column temperature of 25 °C. The mobile phase was acetonitrile/0.8 % aqueous phosphate solution (in gradient elution mode) at a flow rate of 1 mL/min, and the detection wavelength was 270 nm. The method was validated as per the guidelines of the Chinese Pharmacopoeia. Results: The chromatograms of the samples showed 11 common peaks, of which no. 4 was identified as that of gastrodin. Data for the samples were analysed statistically using similarity analysis and hierarchical cluster analysis (HCA). The similarity indices between the reference chromatogram and the samples' chromatograms were all > 0.80; for G. rhizoma from Guizhou, Shanxi and Sichuan they were 0.854 - 0.885, 0.915 - 0.930 and 0.820 - 0.848, respectively. The samples could be divided into three clusters at a rescaled distance of 7.5: S1 - S4 as cluster 1, S5 - S8 as cluster 2, and the others grouped into cluster 3. Conclusion: The findings indicate that HPLC fingerprinting technology is appropriate for the quality control and origin discrimination of G. rhizoma.
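The similarity analysis mentioned typically compares each sample's fingerprint with the reference chromatogram via a congruence (cosine) coefficient. A minimal sketch, with invented peak-area vectors:

```python
from math import sqrt

# Congruence (cosine) coefficient between chromatographic fingerprints,
# as used to compare sample chromatograms with a reference.
# The peak-area vectors below are invented for illustration.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

reference = [120, 45, 80, 300, 60, 25]   # areas of the common peaks
sample    = [110, 50, 70, 310, 55, 30]

sim = cosine_similarity(reference, sample)
print(f"similarity = {sim:.3f}")
```

A batch whose coefficient falls below a chosen cut-off (e.g. 0.80, as in the abstract) would be flagged as deviating from the reference fingerprint.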

Relevance: 100.00%

Publisher:

Abstract:

A comprehensive user model, built by monitoring a user's current use of applications, can be an excellent starting point for building adaptive user-centred applications. The BaranC framework monitors all user interaction with a digital device (e.g. a smartphone), and also collects all available context data (such as from sensors in the digital device itself, in a smart watch, or in smart appliances) in order to build a full model of user application behaviour. The model built from the collected data, called the UDI (User Digital Imprint), is further augmented by analysis services, for example a service that produces activity profiles from smartphone sensor data. The enhanced UDI model can then be the basis for building an appropriate adaptive application that is user-centred, as it is based on an individual user model. As BaranC supports continuous user monitoring, an application can be dynamically adaptive in real time to the current context (e.g. time, location or activity). Furthermore, since BaranC continuously augments the user model with more monitored data, the user model changes over time, and the adaptive application can adapt gradually to changing user behaviour patterns. BaranC has been implemented as a service-oriented framework in which the collection of data for the UDI, and all sharing of the UDI data, are kept strictly under the user's control. In addition, being service-oriented allows (with the user's permission) its monitoring and analysis services to be easily used by third parties in order to provide third-party adaptive assistant services. An example third-party service demonstrator, built on top of BaranC, proactively assists a user by dynamically predicting, based on the current context, which apps and contacts the user is likely to need. BaranC introduces an innovative user-controlled unified service model for monitoring and using personal digital activity data in order to provide adaptive user-centred applications.
This aims to improve on the current situation, where the diversity of adaptive applications results in a proliferation of applications monitoring and using personal data, leading to a lack of clarity, a dispersal of data, and a diminution of user control.
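The user-controlled sharing idea can be sketched as a store that releases UDI data to a third-party service only for event types the user has explicitly granted; all class, service, and event names below are hypothetical, not BaranC's API:

```python
# Sketch of user-controlled data sharing: monitored events go into a
# user-model store, and a third-party service only sees event types
# covered by an explicit user grant. All names are hypothetical.

class UserDigitalImprint:
    def __init__(self):
        self._events = []
        self._grants = {}          # service name -> set of event types

    def record(self, event_type, payload):
        self._events.append((event_type, payload))

    def grant(self, service, event_types):
        """User explicitly allows a service to read certain event types."""
        self._grants.setdefault(service, set()).update(event_types)

    def query(self, service, event_type):
        """A service sees only the event types the user has granted it."""
        if event_type not in self._grants.get(service, set()):
            raise PermissionError(f"{service} may not read {event_type}")
        return [p for t, p in self._events if t == event_type]

udi = UserDigitalImprint()
udi.record("location", (51.5, -0.1))
udi.record("app_usage", "mail")
udi.grant("assistant", {"app_usage"})

print(udi.query("assistant", "app_usage"))   # ['mail']
# udi.query("assistant", "location") would raise PermissionError
```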

Relevance: 100.00%

Publisher:

Abstract:

This Ph.D. project aimed at the development and improvement of analytical solutions for the control of quality and authenticity of virgin olive oils. In line with this main objective, different research activities were carried out. Concerning the quality control of olive oil, two of the official parameters defined by regulations (free acidity and fatty acid ethyl esters) were taken into account, and more sustainable and easier analytical solutions were developed and validated in-house. Regarding authenticity, two different issues were addressed: verification of the geographical origin of extra virgin (EVOOs) and virgin olive oils (VOOs), and detection of soft-deodorized oils illegally mixed with EVOOs. For fatty acid ethyl esters, a revised method based on off-line HPLC-GC-FID (with a PTV injector) was developed, revising both the preparative step and the GC injector required by the official method; the method was then validated in-house by evaluating several parameters. Concerning free acidity, a portable system suitable for in-situ measurement of VOO free acidity was developed and validated in-house. Its working principle is the estimation of olive oil free acidity from the conductance of an emulsion of a hydro-alcoholic solution with the sample under test. The procedure is very quick and easy and, therefore, suitable for people without specific training. Another study carried out during the Ph.D. concerned the application of flash gas chromatography for volatile compound analysis, combined with untargeted chemometric data elaboration, to the discrimination of EVOOs and VOOs of different geographical origin. A set of 210 samples from different EU member states and extra-EU countries was collected and analysed. Data were elaborated using two different classification techniques, one linear (PLS-DA) and one non-linear (ANN).
Finally, a preliminary study on the application of GC-IMS (gas chromatography - ion mobility spectrometry) for the assessment of soft-deodorized olive oils was carried out.
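The portable free-acidity device maps a conductance reading to an acidity estimate, which implies a calibration step; a least-squares sketch with invented calibration pairs (not the thesis's data):

```python
# Least-squares calibration sketch: map emulsion conductance readings
# to free acidity (% oleic acid). The calibration pairs are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibration: conductance (µS) vs reference acidity (%)
conductance = [12.0, 18.0, 25.0, 33.0, 40.0]
acidity     = [0.20, 0.35, 0.52, 0.72, 0.90]

a, b = fit_line(conductance, acidity)
reading = 28.0
print(f"estimated free acidity: {a * reading + b:.2f} %")
```

In a real instrument, the calibration curve would be fitted against titration reference values and checked periodically, since emulsion conductance also depends on temperature and matrix effects.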

Relevance: 100.00%

Publisher:

Abstract:

In pursuit of the European Union's ambitious target of achieving a carbon-neutral economy by 2050, researchers, vehicle manufacturers, and original equipment manufacturers have been at the forefront of exploring cutting-edge technologies for internal combustion engines. The introduction of these technologies has significantly increased the effort required to calibrate the models implemented in engine control units. Consequently, the development of tools that reduce the costs and time required during the experimental phases has become imperative. Additionally, to comply with ever-stricter limits on CO2 emissions, it is crucial to develop advanced control systems that enhance traditional engine management systems in order to reduce fuel consumption. Furthermore, the introduction of new homologation cycles, such as the real driving emissions cycle, compels manufacturers to bridge the gap between engine operation in laboratory tests and in real-world conditions. Within this context, this thesis showcases the performance and cost benefits achievable through the implementation of an auto-adaptive closed-loop control system, leveraging in-cylinder pressure sensors, in a heavy-duty diesel engine designed for mining applications. The thesis also explores the promising prospect of real-time self-adaptive machine learning models, particularly neural networks, for developing an automatic system that uses in-cylinder pressure sensors for precise calibration of the target combustion phase and optimal spark advance in spark-ignition engines. To facilitate the application of these combustion-feedback-based algorithms in production, the thesis discusses the results obtained from the development of a cost-effective sensor for indirect cylinder pressure measurement.
Finally, to ensure the quality control of the proposed affordable sensor, the thesis provides a comprehensive account of the design and validation process for a piezoelectric washer test system.
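The auto-adaptive closed-loop idea can be illustrated by an integral-style correction that nudges spark advance until a measured combustion-phase metric (e.g. the MFB50 crank angle) reaches its target. The engine response model and gain below are invented, not the thesis's engine:

```python
# Illustrative closed-loop combustion-phasing control: an integral
# correction adjusts spark advance until the measured combustion
# phase hits its target. The plant model and gain are assumptions.

def plant(spark_advance_deg):
    """Hypothetical engine: more advance -> earlier combustion phase."""
    return 20.0 - 0.8 * spark_advance_deg   # MFB50 in deg aTDC

def run_loop(target_mfb50=8.0, steps=50, ki=0.5):
    advance = 10.0                           # initial spark advance (deg)
    for _ in range(steps):
        measured = plant(advance)            # from in-cylinder pressure
        error = measured - target_mfb50      # positive -> combustion too late
        advance += ki * error                # integral-style correction
    return advance, plant(advance)

advance, mfb50 = run_loop()
print(f"converged advance = {advance:.2f} deg, MFB50 = {mfb50:.2f} deg aTDC")
```

In a real engine the "plant" response varies with speed, load, and fuel, which is exactly why the thesis pursues self-adaptive models rather than a fixed map.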