909 results for data accuracy


Relevance:

60.00%

Publisher:

Abstract:

Inventory data management is defined as the accurate creation and maintenance of item master data, inventory location data and inventory balances per inventory location. The accuracy of inventory data depends on many elements of product data management, such as the management of changes during a component's life-cycle and the accuracy of product configuration in enterprise resource planning systems. Cycle-counting is the daily counting of inventory balances per inventory location and their comparison against the system data. The cycle-counting process is a way to measure the accuracy of a company's inventory data. Through a well-managed cycle-counting process, a company gains a great deal of information about its inventory data accuracy. General inaccuracy of the inventory data cannot be fixed merely by assigning resources to cycle-counting; the change requires disciplined adherence to the defined processes by all parties involved in updating inventory data throughout a component's life-cycle. The processes affecting inventory data are mapped and appropriate metrics are defined in order to achieve better manageability of the inventory data. The life-cycles of a single component and of a product are used in evaluating the processes.
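A common cycle-counting metric is inventory record accuracy: the share of counted locations whose system balance matches the physical count, optionally within a tolerance. A minimal sketch, with all record data hypothetical:

```python
def record_accuracy(records, tolerance=0.0):
    """Share of locations where |system - counted| <= tolerance * system."""
    hits = 0
    for system_qty, counted_qty in records:
        allowed = tolerance * system_qty
        if abs(system_qty - counted_qty) <= allowed:
            hits += 1
    return hits / len(records)

# Daily cycle count: (system balance, physical count) per location.
counts = [(100, 100), (50, 48), (20, 20), (75, 75)]
print(record_accuracy(counts))        # exact-match accuracy
print(record_accuracy(counts, 0.05))  # accuracy within a 5% tolerance
```

Tracking this figure over time shows whether disciplined process adherence is actually improving the data, rather than just consuming counting effort.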

Relevance:

60.00%

Publisher:

Abstract:

The objective of this thesis is to research the Manufacturing Planning and Control (MPC) system and Master Scheduling (MS) in a manufacturing firm. The study is conducted at Ensto Finland Corporation, which operates in the field of electrical systems and supplies. The paper consists of theoretical and empirical parts. The empirical part is based on weekly operating at Ensto and includes inter-firm material analysis, learning and meetings. Master Scheduling is an important module of an MPC system, since it is beneficial in transforming strategic production plans based on demand forecasting into operational schedules. Furthermore, capacity planning tools can contribute remarkably to production planning: with a Rough-Cut Capacity Planning (RCCP) tool, an MS plan can be critically analyzed in terms of available key resources in the real manufacturing environment. Currently, there are remarkable inefficiencies in Ensto's practices: the system is not able to take seasonal demand into consideration and react to market changes in time, which can cause significant lost sales. However, these inefficiencies could be eliminated through the appropriate utilization of MS and RCCP tools. To utilize MS and RCCP tools in Ensto's production environment, further testing in a real production environment is required. Moreover, data accuracy, appropriate commitment to adopting and learning the new tools, and continuous development of functions closely related to MS, such as sales forecasting, need to be ensured.
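Rough-Cut Capacity Planning can be illustrated as a bill-of-resources calculation: multiply master schedule quantities by per-unit resource loads and compare the result against available capacity at each key work centre. A minimal sketch; the products, work centres and figures are hypothetical:

```python
# Master schedule: planned quantity per product for one period.
master_schedule = {"enclosure": 400, "connector": 900}

# Bill of resources: hours required per unit at each key work centre.
bill_of_resources = {
    "enclosure": {"moulding": 0.05, "assembly": 0.10},
    "connector": {"moulding": 0.02, "assembly": 0.03},
}

available_hours = {"moulding": 45, "assembly": 80}

def rccp_load(schedule, bor):
    """Total hours required per work centre for the given schedule."""
    load = {}
    for product, qty in schedule.items():
        for centre, hours_per_unit in bor[product].items():
            load[centre] = load.get(centre, 0.0) + qty * hours_per_unit
    return load

load = rccp_load(master_schedule, bill_of_resources)
for centre, required in load.items():
    print(centre, required, "h required,", available_hours[centre], "h available")
```

If the required hours exceed the available hours at any centre, the MS plan must be revised before it is released to operations.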

Relevance:

60.00%

Publisher:

Abstract:

The amateur birding community has a long and proud tradition of contributing to bird surveys and bird atlases. Coordinated activities such as Breeding Bird Atlases and the Christmas Bird Count are examples of "citizen science" projects. With the advent of technology, Web 2.0 sites such as eBird have been developed to facilitate online sharing of data and thus increase the potential for real-time monitoring. However, as recently articulated in an editorial in this journal and elsewhere, monitoring is best served when based on a priori hypotheses. Harnessing citizen scientists to collect data following a hypothetico-deductive approach carries challenges. Moreover, the use of citizen science in scientific and monitoring studies has raised issues of data accuracy and quality. These issues are compounded when data collection moves into the Web 2.0 world. An examination of the literature from social geography on the concept of "citizen sensors" and volunteered geographic information (VGI) yields thoughtful reflections on the challenges of data quality/data accuracy when applying information from citizen sensors to research and management questions. VGI has been harnessed in a number of contexts, including for environmental and ecological monitoring activities. Here, I argue that conceptualizing a monitoring project as an experiment following the scientific method can further contribute to the use of VGI. I show how principles of experimental design can be applied to monitoring projects to better control for data quality of VGI. This includes suggestions for how citizen sensors can be harnessed to address issues of experimental controls and how to design monitoring projects to increase randomization and replication of sampled data, hence increasing scientific reliability and statistical power.
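The randomization and replication principles argued for above can be sketched as a simple survey-assignment routine: each site is independently sampled by several randomly chosen observers, so observer effects can be separated from site effects. The site names, observer pool and replicate count are hypothetical:

```python
import random

def assign_surveys(sites, observers, replicates=3, seed=42):
    """Randomly assign each site to `replicates` distinct observers,
    so every site is sampled independently more than once."""
    rng = random.Random(seed)
    plan = {}
    for site in sites:
        plan[site] = rng.sample(observers, replicates)
    return plan

sites = ["wetland_A", "forest_B", "shore_C"]
observers = ["obs1", "obs2", "obs3", "obs4", "obs5"]
plan = assign_surveys(sites, observers)
for site, team in plan.items():
    print(site, "->", team)
```

A coordinated design of this kind gives the replication and randomization needed to estimate data quality of volunteered observations, instead of relying on wherever citizen sensors happen to report from.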

Relevance:

60.00%

Publisher:

Abstract:

The thesis presented here is the first study worldwide to investigate and characterize the cytokine immune response in human infections by the Orthobunyavirus Oropouche. As methodology, a total of 320 human serum samples were used, of which 60 came from a blood bank (negative controls) and 260 were obtained during two Oropouche virus outbreaks in the states of Pará and Amapá (Brazil), the latter being divided into eight subgroups to ensure data accuracy. The collected samples were analyzed for clinical data/symptomatology from medical records, for serological data through antibody titration by hemagglutination inhibition (IgM/IgG), and for plasma cytokine levels by flow cytometry, which enabled the technical description of cytokine quantification and the analysis of the frequency of low and high cytokine producers. The data obtained made it possible to observe the variables and the behaviour of the cytokine signatures expressed by the patients upon serological confirmation of the virus, as well as the behaviour of these serum analytes in the presence of specific symptoms such as fever, chills, headache and dizziness, leading to the following conclusions: a) there is a pattern in the synthesis of pro-inflammatory and regulatory cytokines; b) a balance is observed in the immune response profile between pro-inflammatory (Th1) and modulatory (Th17) cytokines; c) Oropouche virus infection alters cytokine production in infected individuals; d) the results also show that, comparing non-responders with early responders, there was an increase in IL-1β and a decrease in IL-12; comparing non-responders with late responders, a decrease in IL-8 and an increase in IFN-α, IL-23 and IL-17; comparing non-responders with early responders, an increase in IL-4 and IFN-γ; comparing early responders with late responders, a decrease in IFN-α and IL-6; early responders in general showed a decrease in IL-10 and late responders showed an increase in IL-5; e) the results further show the expression of IL-5 in patients who manifested the symptoms specific to Oropouche infection (fever, chills, headache and dizziness), suggesting that this signal is directly associated with the pathogenesis of the virus; f) this research needs to be complemented by further studies, such as those on chemokine expression.
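The frequency analysis of low and high cytokine producers mentioned above is commonly done by dichotomizing each cytokine at a global cut-off, such as the median across all subjects: values above the cut-off count as high producers. A minimal sketch; the measurements below are hypothetical:

```python
from statistics import median

def high_producer_frequency(values):
    """Classify each subject as a high producer if their cytokine level
    exceeds the global median cut-off; return cut-off and high fraction."""
    cutoff = median(values)
    high = [v for v in values if v > cutoff]
    return cutoff, len(high) / len(values)

il5 = [2.1, 3.4, 0.8, 5.6, 2.9, 7.2, 1.0, 4.4]  # pg/mL, hypothetical
cutoff, freq = high_producer_frequency(il5)
print("cut-off:", cutoff, "high-producer frequency:", freq)
```

Comparing high-producer frequencies between groups (e.g. non-responders versus early responders) is then a simple proportion comparison per cytokine.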

Relevance:

60.00%

Publisher:

Abstract:

INTRODUCTION Data concerning outcome after management of acetabular fractures by anterior approaches, with a focus on age and on fractures associated with roof impaction, central dislocation and/or quadrilateral plate displacement, are rare. METHODS Between October 2005 and April 2009, a series of 59 patients (mean age 57 years, range 13-91) with fractures involving the anterior column was treated using the modified Stoppa approach, either alone or, for reduction of displaced iliac wing or low anterior column fractures, in combination with the 1st window of the ilioinguinal approach or the modified Smith-Petersen approach, respectively. Surgical data, accuracy of reduction, clinical and radiographic outcome at mid-term, and the need for endoprosthetic replacement in the postoperative course (defined as failure) were assessed; uni- and multivariate regression analyses were performed to identify independent predictive factors (e.g. age, nonanatomical reduction, acetabular roof impaction, central dislocation, quadrilateral plate displacement) for failure. Outcome was assessed for all patients in general and according to age in particular; patients were subdivided into two groups by age (group "<60yrs", group "≥60yrs"). RESULTS Forty-three of 59 patients (mean age 54yrs, 13-89) were available for evaluation. Of these, anatomic reduction was achieved in 72% of cases. Nonanatomical reduction was identified as the only multivariate predictor of subsequent total hip replacement (adjusted hazard ratio 23.5; p<0.01). A statistically significantly higher rate of nonanatomical reduction was observed in the presence of acetabular roof impaction (p=0.01). In 16% of all patients, total hip replacement was performed, and in 69% of patients with preserved hips the clinical results were excellent or good at a mean follow-up of 35±10 months (range: 24-55). No statistically significant differences were observed between the two groups.
CONCLUSION Nonanatomical reconstruction of the articular surfaces puts joint-preserving management of acetabular fractures through an isolated or combined modified Stoppa approach at risk of failure, resulting in total joint replacement at mid-term. In the elderly, joint-preserving surgery is worth considering, as promising clinical and radiographic results can be obtained at mid-term.

Relevance:

60.00%

Publisher:

Abstract:

Increasing commercial pressures on land are provoking fundamental and far-reaching changes in the relationships between people and land. Much knowledge on land-oriented investment projects currently comes from the media. Although this provides a good starting point, lack of transparency and rapidly changing contexts mean that it is often unreliable. The International Land Coalition, in partnership with Oxfam Novib, the Centre de coopération internationale en recherche agronomique pour le développement (CIRAD), the University of Pretoria, the Centre for Development and Environment of the University of Bern (CDE), and GIZ, started to compile an inventory of land-related investments. This project aims to better understand the extent, trends and impacts of land-related investments by supporting an ongoing and systematic stocktaking exercise of the various investment projects currently taking place worldwide. It involves a large number of organizations and individuals who work in areas where land transactions are being made and are able to provide details of such investments. The project monitors land transactions in rural areas that imply a transformation of land use rights from communities and smallholders to commercial use, and that are made by both domestic and foreign investors (private actors, governments, government-backed private investors). The focus is on investments for food or agrofuel production, timber extraction, carbon trading, mineral extraction, conservation and tourism. A novel way of using ICT to document land acquisitions in a spatially explicit way, through an approach called "crowdsourcing", is being developed. This approach will allow actors to share information and knowledge directly and at any time on a public platform, where it will be scrutinized for reliability and cross-checked against other sources. Up to now, over 1200 deals have been recorded across 96 countries.
Details of such transactions have been classified in a matrix and distributed to over 350 contacts worldwide for verification. The verified information has been geo-referenced and represented in two global maps. This is an open database enabling a continued monitoring exercise and the improvement of data accuracy. More information will be released over time. The opportunity lies in overcoming the constraints of incomplete information by proposing a new way of collecting, enhancing and sharing information and knowledge in a more democratic and transparent manner. The intention is to develop an interactive knowledge platform where any interested person can share and access information on land deals, their links to the stakeholders involved, and their embedding in a geographical context. By making use of new ICT technologies that are increasingly within the reach of local stakeholders, as well as open-access and web-based spatial information systems, it will become possible to create a dynamic database containing spatially explicit data. Feeding in data from a large number of stakeholders, increasingly also by means of new mobile ICT technologies, will open up new opportunities to analyse, monitor and assess highly dynamic trends of land acquisition and rural transformation.
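The reliability scrutiny described above can be sketched as a simple cross-checking rule: a reported deal is treated as verified once a minimum number of independent sources agree on it. The deal records, source names and threshold below are all hypothetical:

```python
def verification_status(reports, min_sources=2):
    """Group crowd reports by deal id and flag deals confirmed
    by at least `min_sources` independent sources."""
    sources_per_deal = {}
    for deal_id, source in reports:
        sources_per_deal.setdefault(deal_id, set()).add(source)
    return {deal: len(srcs) >= min_sources
            for deal, srcs in sources_per_deal.items()}

reports = [
    ("deal-001", "ngo_field_team"),
    ("deal-001", "news_article"),
    ("deal-002", "news_article"),
    ("deal-003", "ngo_field_team"),
    ("deal-003", "ngo_field_team"),  # duplicate source, not independent
]
print(verification_status(reports))
```

Real platforms add provenance, timestamps and spatial matching, but the core idea is the same: agreement between independent reports raises confidence in a deal record.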

Relevance:

60.00%

Publisher:

Abstract:

Basic information on the relief of a watershed, obtained through analytical-descriptive methodologies, provides those who evaluate projects related to the use of natural resources, such as integrated watershed management, environmental impact studies, soil degradation, deforestation and water resource conservation, among others, with the physical parameters needed for their analysis. These processes have a strong spatial component, and Geographic Information Systems (GIS) are extremely useful for them, with Digital Elevation Models (DEM) and their derivatives being a relevant component of this database. Products derived from these models, such as slope, aspect or curvature, will be only as precise as the DEM used to derive them. It is therefore essential to maximize the model's ability to represent the variations of the terrain; to do so, an adequate resolution (grid size) must be selected according to the data available for its generation. This work evaluates the altimetric quality of six DEMs generated from two different source-data capture systems and different grid resolutions. To determine the accuracy of a DEM, a group of control points regarded as "ground truth" is usually compared with the values generated by the model at the same geographic position. The area selected for the study is located in Arrecifes, Buenos Aires province (Argentina), and covers approximately 120 ha. The results obtained for the two algorithms and the three grid sizes analyzed were as follows: for the DEM from contour algorithm, an RMSE (Root Mean Squared Error) of ± 0.11 m (1 m grid), ± 0.11 m (5 m grid) and ± 0.15 m (10 m grid);
for the DEM from vector/points algorithm, an RMSE of ± 0.09 m (1 m grid), ± 0.11 m (5 m grid) and ± 0.11 m (10 m grid). The results support the conclusion that the DEM generated from surveyed spot heights as source data and with the smallest grid size is the only one that satisfies the values listed in the national and international literature, making it suitable for natural resource projects at the ecotope (farm) level. The remaining DEMs show an RMSE that makes them suitable for evaluating projects related to the use of natural resources at the landscape-unit level (a set of ecotopes).
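The RMSE figures above come from comparing DEM elevations with ground-truth control points at the same positions. A minimal sketch of the computation; the elevation pairs are hypothetical:

```python
from math import sqrt

def rmse(control, modeled):
    """Root Mean Squared Error between control-point elevations
    and DEM elevations at the same positions (metres)."""
    residuals = [c - m for c, m in zip(control, modeled)]
    return sqrt(sum(r * r for r in residuals) / len(residuals))

control_z = [52.30, 48.95, 50.10, 49.70]   # surveyed "ground truth"
dem_z     = [52.41, 48.90, 50.00, 49.82]   # sampled from the DEM
print(round(rmse(control_z, dem_z), 3))
```

The same routine applied to each DEM and grid size yields the comparison table that drives the suitability conclusions.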

Relevance:

60.00%

Publisher:

Abstract:

A new topographic database for King George Island, one of the most visited areas in Antarctica, is presented. Data from differential GPS surveys, gathered during the summers of 1997/98 and 1999/2000, were combined with up-to-date coastlines from a SPOT satellite image mosaic and topographic information from maps as well as from the Antarctic Digital Database. A digital terrain model (DTM) was generated using the ARC/INFO GIS. From contour lines derived from the DTM and the satellite image mosaic, a satellite image map was assembled. Extensive information on data accuracy and on the database, as well as on the criteria applied to select place names, is given in the multilingual map. A lack of accurate topographic information in the eastern part of the island was identified. It was concluded that additional topographic surveying or radar interferometry should be conducted to improve the data quality in this area. Three case studies demonstrate potential applications of the improved topographic database. The first two comprise the verification of glacier velocities and the study of glacier retreat from the various input data-sets, as well as the use of the DTM for climatological modelling. The last case study focuses on the use of the new digital database as a basic GIS (Geographic Information System) layer for environmental monitoring and management on King George Island.

Relevance:

60.00%

Publisher:

Abstract:

When we try to analyze and control a system whose model was obtained only from input/output data, accuracy of the model is essential. On the other hand, to make the procedure practical, the modeling stage must be computationally efficient. In this regard, this paper presents the application of the extended Kalman filter to the parametric adaptation of a fuzzy model.
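The idea can be sketched as follows: the model parameters are treated as the EKF state with random-walk dynamics, and each input/output sample updates them recursively. The linear-in-parameters model below is a hypothetical stand-in for the consequent parameters of a fuzzy model, not the paper's actual formulation:

```python
# Extended Kalman filter used as a recursive parameter estimator.
# State = parameters theta of y = theta1*u + theta2 (random-walk dynamics).
def ekf_adapt(samples, q=1e-6, r=0.01):
    theta = [0.0, 0.0]                # initial parameter guess
    P = [[1.0, 0.0], [0.0, 1.0]]      # parameter covariance
    for u, y in samples:
        # Predict: parameters follow a random walk, so only P grows.
        P = [[P[0][0] + q, P[0][1]], [P[1][0], P[1][1] + q]]
        # Measurement model y = theta1*u + theta2; Jacobian H = [u, 1].
        H = [u, 1.0]
        y_hat = theta[0] * u + theta[1]
        S = (H[0] * (P[0][0] * H[0] + P[0][1])
             + H[1] * (P[1][0] * H[0] + P[1][1]) + r)
        # Kalman gain K = P H^T / S.
        K = [(P[0][0] * H[0] + P[0][1]) / S, (P[1][0] * H[0] + P[1][1]) / S]
        innov = y - y_hat
        theta = [theta[0] + K[0] * innov, theta[1] + K[1] * innov]
        # Covariance update P = (I - K H) P.
        P = [[(1 - K[0] * H[0]) * P[0][0] - K[0] * H[1] * P[1][0],
              (1 - K[0] * H[0]) * P[0][1] - K[0] * H[1] * P[1][1]],
             [-K[1] * H[0] * P[0][0] + (1 - K[1] * H[1]) * P[1][0],
              -K[1] * H[0] * P[0][1] + (1 - K[1] * H[1]) * P[1][1]]]
    return theta

# Noise-free samples from y = 2u + 1; the estimate should approach (2, 1).
data = [(u / 10.0, 2 * (u / 10.0) + 1) for u in range(50)]
theta = ekf_adapt(data)
print(theta)
```

For a nonlinear fuzzy model the Jacobian H would be the derivative of the model output with respect to the parameters, but the predict/update cycle is unchanged.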

Relevance:

60.00%

Publisher:

Abstract:

Light detection and ranging (LiDAR) technology is beginning to have an impact on agriculture. Canopy volume and/or fruit tree leaf area can be estimated using terrestrial laser sensors based on this technology. However, these devices may be used with different options depending on the resolution and scanning mode. As a consequence, data accuracy and LiDAR-derived parameters are affected by the sensor configuration, and may vary according to the vegetative characteristics of the tree crops. Given this scenario, users and suppliers of these devices need to know how to use the sensor in each case. This paper presents a computer program to determine the best configuration, allowing simulation and evaluation of different LiDAR configurations in various tree structures (or training systems). The ultimate goal is to optimise the use of laser scanners in field operations. The software presented generates a virtual orchard and then allows the scanning to be simulated with a laser sensor. Trees are created using a hidden Markov tree (HMT) model. Varying the foliar structure of the orchard, the LiDAR simulation was applied to twenty different artificially created orchards, with or without leaves, from two positions (lateral and zenith). To validate the laser sensor configuration, the leaf surface of the simulated trees was compared with the parameters obtained from the LiDAR measurements: the impacted leaf area, the impacted total area (leaves and wood), and the impacted area in the three outer layers of leaves.
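A scanning simulation of this kind can be reduced to ray casting against a voxelized canopy: each simulated beam stops at the first occupied voxel, as a LiDAR return would, and impacted area is tallied from the hits. A deliberately simplified 2D sketch; the canopy grid, beam spacing and cell area are hypothetical:

```python
def simulate_lateral_scan(canopy, cell_area=0.01):
    """Cast one horizontal beam per row of a 2D canopy grid (1 = leaf
    voxel); each beam registers the first occupied voxel it meets.
    Returns the impacted area and the impact coordinates."""
    hits = []
    for row_idx, row in enumerate(canopy):
        for col_idx, voxel in enumerate(row):
            if voxel == 1:           # beam intercepted by the canopy
                hits.append((row_idx, col_idx))
                break                # voxels behind are occluded
    return len(hits) * cell_area, hits

canopy = [
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 0],   # gap: this beam passes straight through
    [1, 1, 1, 1],
]
area, hits = simulate_lateral_scan(canopy)
print(area, hits)
```

Scanning the same virtual canopy from a different position (e.g. zenith instead of lateral) changes which voxels are occluded, which is exactly the sensor-configuration effect the software is built to evaluate.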

Relevance:

60.00%

Publisher:

Abstract:

Current Ambient Intelligence and Intelligent Environment research focuses on interpreting a subject's behaviour at the activity level by logging Activities of Daily Living (ADL) such as eating, cooking, etc. In general, the sensors employed (e.g. PIR sensors, contact sensors) provide low-resolution information. Meanwhile, the expansion of ubiquitous computing allows researchers to gather additional information from different types of sensor, which makes it possible to improve activity analysis. Building on previous research on sitting posture detection, this research attempts to analyse human sitting activity further. The aim of this research is to use a non-intrusive, low-cost chair system with embedded pressure sensors to recognize a subject's activity from their detected postures. There are three steps in this research: the first is to find a hardware solution for low-cost sitting posture detection, the second is to find a suitable strategy for sitting posture detection, and the last is to correlate time-ordered sitting posture sequences with sitting activity. The author built a prototype sensing system called IntelliChair for sitting posture detection. Two experiments were conducted in order to determine the hardware architecture of the IntelliChair system. The prototype work examines sensor selection and the integration of various sensors, and indicates the best choice for a low-cost, non-intrusive system. Subsequently, this research applies signal processing theory to explore the frequency characteristics of sitting posture, in order to determine a suitable sampling rate for the IntelliChair system. For the second and third steps, ten subjects were recruited for the collection of sitting posture data and sitting activity data. The former dataset was collected by asking subjects to perform certain pre-defined sitting postures on IntelliChair, and it is used for the posture recognition experiment.
The latter dataset was collected by asking the subjects to perform their normal sitting activity routine on IntelliChair for four hours, and it is used for the activity modelling and recognition experiment. For the posture recognition experiment, two Support Vector Machine (SVM) based classifiers are trained (one for spine postures and the other for leg postures) and their performance evaluated. A Hidden Markov Model is utilized for sitting activity modelling and recognition, in order to infer the selected sitting activities from sitting posture sequences. After experimenting with possible sensors, the Force Sensing Resistor (FSR) was selected as the pressure sensing unit for IntelliChair. Eight FSRs are mounted on the seat and back of a chair to gather haptic (i.e., touch-based) posture information. Furthermore, the research explores the possibility of using alternative non-intrusive sensing technology (the vision-based Kinect sensor from Microsoft) and finds that the Kinect sensor is not reliable for sitting posture detection due to the joint drifting problem. A suitable sampling rate for IntelliChair, determined from the experimental results, is 6 Hz. The posture classification performance shows that the SVM-based classifier is robust to "familiar" subject data (accuracy is 99.8% for spine postures and 99.9% for leg postures). When dealing with "unfamiliar" subject data, the accuracy is 80.7% for spine posture classification and 42.3% for leg posture classification. Activity recognition achieves 41.27% accuracy among the four selected activities (relaxing, playing a game, working with a PC and watching video). The results of this thesis show that individual body characteristics and sitting habits influence both sitting posture and sitting activity recognition. This suggests that IntelliChair is suitable for individual usage, but a training stage is required.
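The posture-recognition pipeline above maps an 8-value FSR pressure vector to a posture label. As a stand-in for the SVM classifiers (the pressure values and posture labels below are hypothetical), a minimal nearest-centroid classifier over per-posture training means illustrates the idea:

```python
def train_centroids(samples):
    """Average the 8-FSR pressure vectors of each labelled posture."""
    sums, counts = {}, {}
    for label, vec in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(centroids, vec):
    """Assign the posture whose centroid is nearest in Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], vec))

training = [
    ("upright",      [5, 5, 5, 5, 4, 4, 4, 4]),
    ("upright",      [6, 5, 5, 6, 4, 5, 4, 4]),
    ("lean_forward", [8, 8, 7, 8, 1, 1, 1, 1]),
    ("lean_forward", [7, 8, 8, 7, 1, 2, 1, 1]),
]
centroids = train_centroids(training)
print(classify(centroids, [6, 6, 5, 5, 4, 4, 5, 4]))
```

An SVM replaces the centroid distance with a learned maximum-margin boundary, which is what gives the reported robustness on familiar subjects; the interface (pressure vector in, posture label out) is the same.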

Relevance:

60.00%

Publisher:

Abstract:

The MGH44 geometric geoid model is the result of a direct comparison between GPS measurements and conventional levelling over points of a geodetic network located in the 50 km2 urban area of the city of Heredia, Costa Rica. The MGH44 grid yields the geoid undulation for any point in that zone, a value that can be used to estimate the height above mean sea level from ellipsoidal heights measured with GPS. This document describes the procedures and calculations carried out to evaluate the vertical quality of the MGH44 model by applying the National Standard for Spatial Data Accuracy (NSSDA). By generating a new grid with only 36 data points, called MGH36, new geoid undulation values were obtained for the remaining 20 points chosen as control. During data processing, different algorithms were applied to check whether the data of the 20 control points follow a normal distribution and to verify that this set contained no gross errors. The mean geoid undulation of the control points is 14.287 m, and the computation according to the NSSDA standard yielded a vertical data accuracy of ± 0.045 m. Subsequently, using the bootstrap technique, the values 14.233 m and 14.353 m were computed with 95% probability as the limits of the confidence interval of the mean.
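The NSSDA vertical accuracy statistic is computed from the discrepancies between the model and the control points: Accuracy_z = 1.9600 × RMSE_z, which expresses vertical accuracy at the 95% confidence level. A minimal sketch; the undulation discrepancies below are hypothetical:

```python
from math import sqrt

def nssda_vertical_accuracy(discrepancies):
    """NSSDA vertical accuracy at 95% confidence: 1.9600 * RMSE_z,
    where discrepancies are model minus control values (metres)."""
    rmse_z = sqrt(sum(d * d for d in discrepancies) / len(discrepancies))
    return 1.9600 * rmse_z

diffs = [0.012, -0.025, 0.018, -0.030, 0.022, -0.015]  # hypothetical
print(round(nssda_vertical_accuracy(diffs), 3))
```

Applying the same formula to the 20 control-point discrepancies of MGH36 versus MGH44 is what produced the reported ± 0.045 m figure.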

Relevance:

60.00%

Publisher:

Abstract:

When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision-making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but run, and the changes of system states can be observed at any point in time. This provides insight into system dynamics rather than merely predicting the output of a system for specific inputs. Simulation is not a decision-making tool but a decision-support tool, allowing better informed decisions to be made. Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one to gain insight, and also to test new theories and practices, without disrupting the daily routine of the focal organisation. What one can expect to gain from a simulation study is well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, the model allows one to answer questions such as:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?
The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends rather than giving precise and absolute predictions of the target system's performance. The numerical results of a simulation experiment are, on their own, most often not very useful; they need to be rigorously analysed with statistical methods, then considered in the context of the real system and interpreted qualitatively to make meaningful recommendations or compile best-practice guidelines. One needs a good working knowledge of the behaviour of the real system to fully exploit the understanding gained from simulation experiments. The goal of this chapter is to introduce the newcomer to what we think is a valuable addition to the toolset of analysts and decision makers. We summarise information gathered from the literature and first-hand experience acquired over the last five years while obtaining a better understanding of this exciting technology. We hope this will help you avoid some pitfalls that we have unwittingly encountered. Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system.
Section 6 provides a collection of resources for further studies and finally in Section 7 we will conclude the chapter with a short summary.
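The points above about stochastic output and statistical analysis can be sketched minimally: independent replications of a toy agent-based model, with a mean and an approximate confidence half-width computed over the runs. Everything about the model (agents as random walkers, the step rule, the run counts) is a hypothetical illustration, not the chapter's example:

```python
import random
from statistics import mean, stdev

def run_once(n_agents, n_steps, rng):
    """One replication: agents take unit random steps on a line;
    the output statistic is the mean distance from the origin."""
    positions = [0] * n_agents
    for _ in range(n_steps):
        positions = [p + rng.choice((-1, 1)) for p in positions]
    return mean(abs(p) for p in positions)

def replicate(n_reps=30, n_agents=50, n_steps=100, seed=1):
    """Independent replications; report the mean output and an
    approximate 95% confidence half-width (1.96 * s / sqrt(n))."""
    outputs = [run_once(n_agents, n_steps, random.Random(seed + r))
               for r in range(n_reps)]
    half_width = 1.96 * stdev(outputs) / len(outputs) ** 0.5
    return mean(outputs), half_width

m, hw = replicate()
print(f"mean distance: {m:.2f} +/- {hw:.2f}")
```

Reporting a single run's number would hide the run-to-run variability; the confidence interval over replications is the minimum statistical treatment a stochastic simulation result needs.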

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we present a method for estimating local thickness distribution in finite element models, applied to injection molded and cast engineering parts. This method features considerably improved performance compared to two previously proposed approaches, and has been validated against thickness measured by different human operators. We also demonstrate that using this method to assign a distribution of local thickness in FEM crash simulations results in a much more accurate prediction of the real part performance, thus increasing the benefits of computer simulations in engineering design by enabling zero-prototyping and thus reducing product development costs. The simulation results have been compared to experimental tests, evidencing the advantage of the proposed method. Thus, the proposed approach to considering local thickness distribution in FEM crash simulations has high potential in the product development process of complex and highly demanding injection molded and cast parts, and is currently being used by Ford Motor Company.
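Local thickness of a thin-walled part can be approximated, in the simplest setting, as twice the distance from a point of the solid to the nearest boundary. A deliberately simplified 2D sketch on a binary pixel grid (the geometry is hypothetical, and real implementations work on 3D meshes; this is not the paper's method):

```python
def local_thickness(solid, cell=1.0):
    """For each solid cell (1) of a 2D binary grid, estimate local
    thickness as twice the Chebyshev distance to the nearest
    non-solid cell, in grid units scaled by `cell`."""
    rows, cols = len(solid), len(solid[0])

    def dist_to_boundary(r, c):
        d = 0
        while True:
            d += 1
            for rr in range(r - d, r + d + 1):
                for cc in range(c - d, c + d + 1):
                    if not (0 <= rr < rows and 0 <= cc < cols) or solid[rr][cc] == 0:
                        return d

    return [[2 * dist_to_boundary(r, c) * cell if solid[r][c] else 0.0
             for c in range(cols)] for r in range(rows)]

# A 3-cell-thick plate: interior cells report greater local thickness
# than cells at the free edges.
plate = [[1] * 7 for _ in range(3)]
thickness = local_thickness(plate)
print(thickness[1])  # mid-plane row
```

A per-element thickness field of this kind is what gets mapped onto the FEM mesh so that the crash model reflects the real wall-thickness distribution instead of a single nominal value.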