993 results for Software industry
Abstract:
The aim of this study was to determine the effect of using video analysis software on the interrater reliability of visual assessments of gait videos in children with cerebral palsy. Two clinicians viewed the same random selection of 20 sagittal and frontal video recordings of 12 children with cerebral palsy routinely acquired during outpatient rehabilitation clinics. Both observers rated these videos in a random sequence for each lower limb using the Observational Gait Scale, once with standard video software and once with video analysis software (Dartfish®), which can perform angle and timing measurements. The video analysis software improved interrater agreement, measured by weighted Cohen's kappas, for the total score (κ 0.778→0.809) and for all items that required angle and/or timing measurements (knee position at mid-stance κ 0.344→0.591; hindfoot position at mid-stance κ 0.160→0.346; foot contact at mid-stance κ 0.700→0.854; timing of heel rise κ 0.769→0.835). The use of video analysis software is an efficient approach to improving the reliability of visual video assessments.
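The agreement statistic reported here, weighted Cohen's kappa, can be computed directly with standard statistics libraries. Below is a minimal sketch using scikit-learn's cohen_kappa_score; the two rating vectors are hypothetical placeholders, not the study's data, and linear weights are assumed (the abstract does not state the weighting scheme used).

```python
# Minimal sketch: weighted Cohen's kappa for two raters' ordinal scores
# (hypothetical data, not the study's actual ratings).
from sklearn.metrics import cohen_kappa_score

# Ordinal item scores (e.g., 0-3) assigned by two observers to the same videos.
rater_a = [3, 2, 2, 1, 0, 3, 2, 1, 1, 0]
rater_b = [3, 2, 1, 1, 0, 3, 3, 1, 2, 0]

# 'linear' weights penalize disagreements in proportion to their distance
# on the ordinal scale; 'quadratic' weights are another common choice.
kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"Weighted Cohen's kappa: {kappa:.3f}")
```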
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.
Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.
Results: Twelve software tools were identified, tested and ranked, providing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs allow the user to add their own drug models. Ten programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top 2 software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.
Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools would possibly not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage and report generation.
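As a rough illustration of the Bayesian a posteriori adjustment the reviewed tools perform, the sketch below computes a maximum a posteriori (MAP) estimate of an individual clearance from a single measured concentration and then derives an infusion rate for a target level. It assumes a one-compartment model at steady state under constant-rate infusion with lognormal prior and residual error; all parameter values are illustrative and not taken from any of the programs reviewed.

```python
# Minimal sketch of Bayesian (MAP) individualization, assuming a one-compartment
# model at steady state under constant-rate infusion (Css = Rate / CL).
# All numbers are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize_scalar

rate = 50.0      # mg/h, constant infusion rate (assumed)
c_obs = 12.0     # mg/L, measured steady-state concentration (assumed)
cl_pop = 3.5     # L/h, population clearance (assumed prior mean)
omega = 0.3      # SD of log-clearance across the population (assumed)
sigma = 0.2      # residual SD on the log-concentration scale (assumed)

def neg_log_posterior(log_cl):
    cl = np.exp(log_cl)
    pred = rate / cl                                              # model-predicted Css
    loglik = -0.5 * ((np.log(c_obs) - np.log(pred)) / sigma) ** 2
    logprior = -0.5 * ((log_cl - np.log(cl_pop)) / omega) ** 2
    return -(loglik + logprior)

res = minimize_scalar(neg_log_posterior, bounds=(np.log(0.1), np.log(50.0)), method="bounded")
cl_map = np.exp(res.x)

# Infusion rate that targets a desired steady-state concentration with the MAP clearance.
c_target = 10.0  # mg/L (assumed target)
print(f"MAP clearance: {cl_map:.2f} L/h, suggested rate: {cl_map * c_target:.1f} mg/h")
```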
Abstract:
Newsletter produced by Department of Agriculture and Land Stewardship about the animal industry in Iowa. Previously titled Animal Industry News.
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.
Methods: The literature and Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.
Results: Twelve software tools were identified, tested and ranked, providing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available in some programs. Nevertheless, 8 programs offer the ability to add new drug models based on population PK data. Ten computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.
Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools would possibly not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use in routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage and automated report generation.
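To illustrate the a priori side described above (a dose suggested from patient covariates alone, before any concentration is measured), the sketch below scales a population clearance allometrically by body weight and sizes a maintenance dose for a target average steady-state concentration. The covariate model and all numbers are assumptions for illustration, not taken from any of the tools reviewed.

```python
# Minimal sketch of an a priori dose suggestion from patient covariates alone,
# assuming a population clearance scaled allometrically by body weight.
# Parameter values and the covariate model are illustrative assumptions.

def a_priori_maintenance_dose(weight_kg, c_target_avg, tau_h,
                              cl_pop_70kg=4.0, allometric_exp=0.75):
    """Suggest a maintenance dose (mg) per dosing interval tau_h (hours)
    so that the average steady-state concentration approaches c_target_avg (mg/L)."""
    cl_individual = cl_pop_70kg * (weight_kg / 70.0) ** allometric_exp  # L/h
    return cl_individual * c_target_avg * tau_h                          # mg per interval

# Example: 50 kg patient, target average concentration 8 mg/L, 12 h dosing interval.
print(f"{a_priori_maintenance_dose(50, 8.0, 12):.0f} mg every 12 h")
```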
Abstract:
Newsletter produced by Department of Agriculture and Land Stewardship about the animal industry in Iowa. Previously titled Animal Industry News.
Abstract:
Newsletter produced by Department of Agriculture and Land Stewardship about the animal industry in Iowa. Previously titled Animal Industry News.
Abstract:
Newsletter produced by Department of Agriculture and Land Stewardship about the animal industry in Iowa. Previously titled Animal Industry News.
Abstract:
Newsletter produced by Department of Agriculture and Land Stewardship about the animal industry in Iowa. Previously titled Animal Industry News.
Abstract:
New economic geography models show that there may be a strong relationship between economic integration and the geographical concentration of industries. Nevertheless, this relationship is neither unique nor stable, and may follow a bell-shaped (inverted-U) pattern in the long term. The aim of the present paper is to analyze the evolution of the geographical concentration of manufacturing across Spanish regions during the period 1856-1995. We construct several geographical concentration indices for different points in time over these 140 years. The analysis is carried out at two levels of aggregation, for regions corresponding to the NUTS-II and NUTS-III classifications. We confirm that the process of economic integration stimulated the geographical concentration of industrial activity. Nevertheless, the localization coefficients only started to fall after the beginning of the integration of the Spanish economy into international markets in the mid-1970s, and this new path was not interrupted by Spain's entry into the European Union some years later.
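For readers unfamiliar with such measures, the sketch below computes one common geographical concentration index of this kind, a Krugman-style index that sums the absolute gaps between an industry's regional employment shares and the regional shares of aggregate manufacturing. The regional data are made up, and the paper's own indices may be defined differently.

```python
# Minimal sketch of a Krugman-style geographical concentration index:
# the sum of absolute gaps between an industry's regional employment shares
# and the regional shares of aggregate manufacturing. Data are hypothetical.
def krugman_concentration(industry_emp, total_emp):
    s_ind = [e / sum(industry_emp) for e in industry_emp]   # industry shares by region
    s_tot = [e / sum(total_emp) for e in total_emp]         # aggregate shares by region
    return sum(abs(a - b) for a, b in zip(s_ind, s_tot))

# Hypothetical employment in four regions (one industry vs. all manufacturing).
print(krugman_concentration([120, 40, 25, 15], [500, 400, 300, 200]))
```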
Abstract:
This paper analyzes the relationship between the spatial density of economic activity and interregional differences in the productivity of industrial labour in Spain during the period 1860-1999. In the spirit of Ciccone and Hall (1996) and Ciccone (2002), we analyze the evolution of this relationship over the long term in Spain. Using data for the period 1860-1999, we show the existence of an agglomeration effect linking the density of economic activity to labour productivity in industry. This effect has been present since the beginning of the industrialization process in the middle of the 19th century but has been decreasing over time. The estimated elasticity of labour productivity with respect to employment density was close to 8% in the subperiod 1860-1900, fell to around 7% in the subperiod 1914-1930 and 4% in the subperiod 1965-1979, and became insignificant in the final subperiod 1985-1999. At the end of the period analyzed there is no evidence of net agglomeration effects in industry. This result could be explained by a substantial increase in congestion effects in large industrial metropolitan areas, which would have offset the centripetal (agglomeration) forces at work. Furthermore, this result is also consistent with the evidence of a dispersion of industrial activity in Spain during recent decades.
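The elasticity described above is typically obtained from a log-log regression of labour productivity on employment density across regions. The sketch below shows that estimation step on synthetic data generated with an assumed elasticity of 0.05; it only illustrates the method and does not reproduce the paper's estimates.

```python
# Minimal sketch: estimate the agglomeration elasticity as the slope of a
# log-log regression of labour productivity on employment density.
# Synthetic regional data, generated with an assumed true elasticity of 0.05.
import numpy as np

rng = np.random.default_rng(0)
log_density = rng.uniform(0, 6, size=50)                                  # log employment density by region
log_productivity = 2.0 + 0.05 * log_density + rng.normal(0, 0.1, size=50)  # log labour productivity

elasticity, intercept = np.polyfit(log_density, log_productivity, 1)
print(f"Estimated elasticity: {elasticity:.3f}")
```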
Abstract:
There is evidence that experience is rewarded differently across industries. We propose a theoretical model that explains these differences. We assume that worker mobility brings external knowledge into the firm, which increases its productivity. The results show that experience is better rewarded in industries with low mobility costs, strong learning-by-doing and a high technological level. In addition, we find a U-shaped relationship between the returns to experience and the level of absorption of external knowledge, the substitutability between different types of workers, and the variety of knowledge within the industry. The results are consistent with the evidence that R&D-intensive industries reward experience more highly.
Abstract:
Newsletter produced by Department of Agriculture and Land Stewardship about the animal industry in Iowa. Previously titled Animal Industry News.
Abstract:
A large number of applications using manufactured nanoparticles of less than 100 nm are currently being introduced into industrial processes. There is an urgent need to evaluate the risks of these novel particles to ensure their safe production, handling, use, and disposal. However, today we lack even rudimentary knowledge about the types and quantities of industrially used manufactured nanoparticles and the level of exposure in Swiss industry. The goal of this study was to evaluate the use of nanoparticles, the currently implemented safety measures, and the number of potentially exposed workers in all types of industry. To this end, a targeted telephone survey was conducted among health and safety representatives from 197 Swiss companies. The survey showed that nanoparticles are already used in many industrial sectors, not only in companies in the new field of nanotechnology but also in more traditional sectors such as paints. Forty-three companies reported using or producing nanoparticles, and 11 imported and traded prepackaged goods that contain nanoparticles. The following nanoparticles were found to be used in considerable quantities (> 1000 kg/year per company): Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO. The median reported quantity of handled nanoparticles was 100 kg/year. The production of cosmetics, food, paints and powders, and the treatment of surfaces, used the largest quantities of these nanoparticles. Generally, the safety measures were found to be stricter in powder-based than in liquid-based applications. However, the respondents had many open questions about best practices, which points to the need for rapid development of guidelines and protection strategies.
Abstract:
Map of railroads and locations of new industries in Iowa, 1971-1980
Abstract:
Production flow analysis (PFA) is a well-established methodology for transforming a traditional functional layout into a product-oriented layout. The method uses part routings to find natural clusters of workstations that form production cells able to complete parts and components swiftly with simplified material flow. Once implemented, the scheduling system is based on period batch control, aiming to establish fixed planning, production and delivery cycles for the whole production unit. PFA is traditionally applied to job shops with functional layouts; after reorganization into groups, lead times fall, quality improves and personnel motivation rises. Several papers have documented this, yet no research has studied its application to service operations management. This paper aims to show, with real cases, that PFA can be applied not only to job-shop and assembly operations but also to back-office and service processes. The cases clearly show that PFA reduces non-value-adding operations, introduces flow by evening out bottlenecks and diminishes process variability, all of which contribute to efficient operations management.
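One standard way to expose the "natural clusters" that PFA looks for in part routings is rank order clustering (King's algorithm) on a machine-part incidence matrix. The sketch below applies it to a made-up incidence matrix; real PFA studies work from full routing data and add further steps, so this is only an illustration of the clustering idea.

```python
# Minimal sketch of cell formation from part routings via rank order clustering
# (King's algorithm). The machine-part incidence matrix below is a made-up example.
import numpy as np

def rank_order_clustering(matrix):
    """Reorder rows (machines) and columns (parts) of a 0/1 incidence matrix
    so that blocks of 1s (candidate cells) line up along the diagonal."""
    m = np.array(matrix)
    rows = np.arange(m.shape[0])
    cols = np.arange(m.shape[1])
    while True:
        # Sort rows by the binary number each row forms (leftmost column = MSB).
        row_weights = m @ (2 ** np.arange(m.shape[1])[::-1])
        row_order = np.argsort(-row_weights, kind="stable")
        m, rows = m[row_order], rows[row_order]
        # Sort columns by the binary number each column forms (top row = MSB).
        col_weights = (2 ** np.arange(m.shape[0])[::-1]) @ m
        col_order = np.argsort(-col_weights, kind="stable")
        m, cols = m[:, col_order], cols[col_order]
        # Stop when neither rows nor columns moved in this pass.
        if (row_order == np.arange(m.shape[0])).all() and (col_order == np.arange(m.shape[1])).all():
            return m, rows, cols

# Rows = machines, columns = parts; 1 means the part visits the machine.
incidence = [[1, 0, 1, 0, 0],
             [0, 1, 0, 1, 1],
             [1, 0, 1, 0, 0],
             [0, 1, 0, 1, 0]]
blocked, machine_order, part_order = rank_order_clustering(incidence)
print(blocked, machine_order, part_order, sep="\n")
```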