960 results for QA76 Computer software


Relevance:

80.00%

Publisher:

Abstract:

This work aims to develop socio-environmental indicators of disaster risk in precarious areas of human occupation subject to intense environmental dynamics, from the perspective of geographical studies on the subject. The district of Mãe Luiza in Natal, capital of Rio Grande do Norte, was chosen as the study area because it has historically presented several conditions of vulnerability and exposure to disaster risk. After a local socio-environmental characterization, two indices were constructed: the Social Vulnerability Index (SVI, or IVS in Portuguese), based on 17 (seventeen) variables collected through a questionnaire applied to the population of the district on a regular grid (systematic sampling) and classified into 5 (five) levels of social vulnerability from the weighted average; and the Physical and Natural Exposure to Mass Movements Index (EMMI, or IEMM in Portuguese), based on 16 (sixteen) variables that characterize conditions of exposure to mass movements in the district, with levels classified from the weighted average on a scale of 1 (one) to 5 (five). The relationship between these two results, spatialized on the district map, produced the Socio-Environmental Vulnerability Index (SEVI, or IVSA in Portuguese) of Mãe Luiza, also classified into 5 (five) levels following Boolean-logic correlation for cartographic overlay with the ArcGIS v9.3 software, with classes named Very Low, Low, Average, High and Very High socio-environmental vulnerability. The study is based on the methodologies proposed by Guerra et al. (2009) for the EMMI and by Almeida (2010) for the SVI, which were modified and adapted to the local reality, producing a new methodology for this study area. It was concluded that most of the district's area shows High or Very High socio-environmental vulnerability to disasters, with seven (7) critical areas of Very High IVSA and hazards associated with mass movements or flooding. Finally, the main problems found were enumerated as inputs for proposing mitigation measures and/or interventions. They relate to structural vulnerability factors, such as the low constructive standard of households, poor urban drainage, abandoned buildings on landslide paths, and lack of infrastructure for access roads and slope containment; and to social factors, such as lack of education about environmental risk, residents' income and schooling, and the presence of persons with reduced mobility and/or special needs. This reality highlights the need for urgent action aimed at resolving and/or reducing these problems, which is the focus of the final part of this work.
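A minimal sketch of the index construction described above (weighted average of scored variables, a five-level classification, and a combination of the two component indices), assuming hypothetical variables, weights and an illustrative overlay rule; the actual 17 SVI and 16 EMMI variables and the Boolean overlay applied in ArcGIS are not reproduced here.

```python
# Illustrative sketch: weighted-average index and 5-level classification.
# Variable names, weights and the overlay rule are hypothetical placeholders,
# not the thesis parameters.

def weighted_index(scores, weights):
    """Weighted average of variable scores (each on a 1-5 scale)."""
    total_w = sum(weights.values())
    return sum(scores[v] * weights[v] for v in weights) / total_w

def classify(value):
    """Map a 1-5 continuous index onto the five qualitative levels."""
    levels = ["Very Low", "Low", "Average", "High", "Very High"]
    return levels[min(int(round(value)) - 1, 4)]

# Hypothetical scores for one grid cell of the district.
svi_scores  = {"income": 4, "schooling": 5, "mobility": 3}
svi_weights = {"income": 2, "schooling": 1, "mobility": 1}
emmi_scores  = {"slope": 5, "drainage": 4, "vegetation": 3}
emmi_weights = {"slope": 2, "drainage": 1, "vegetation": 1}

svi  = weighted_index(svi_scores,  svi_weights)
emmi = weighted_index(emmi_scores, emmi_weights)

# One possible combination rule for the SEVI overlay (assumption: the worse
# of the two component indices dominates the cell's final class).
sevi = max(svi, emmi)
print(classify(svi), classify(emmi), classify(sevi))
```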

Relevance:

80.00%

Publisher:

Abstract:

Objectives: This study aims to analyse the representations of Women in scientific discourse, particularly in the Social and Behavioural Sciences. Methodology: We conducted a case study using qualitative content analysis, complemented by quantitative information, with the computer software MAXQDA 12. The analysis was based on a categorization grid that we designed. Sample: Our sample consists of 122 articles published in 9 Portuguese scientific journals in Psychology in 2015. Results: From the analysis of the 122 articles across the nine journals, the category "Family" was predominant, followed by "Gender" and then "Aggression". When comparing the number of articles in which each category is present, we found some differences: "Family" is the most frequent, followed by "Female" and "Aggression", and these were also the categories highlighted in most of the journals analysed. Some journals had more than one prevailing category. We also note that more than half of the categories showed neither predominance nor relevance throughout the study. Conclusions: The discourse on Women in scientific studies of the Social and Behavioural Sciences remains tied to traditional social representations associated with the family and the feminine. We did not observe a real transformation in Women's Studies, but rather an echo of other societally accepted discourses and representations.
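A minimal sketch of the two counts compared above (total category occurrences versus the number of articles in which each category appears), using hypothetical coded segments rather than the actual MAXQDA project data.

```python
from collections import Counter

# Hypothetical coded segments: (article_id, category) pairs as they might be
# exported from a qualitative-analysis project; not the study's real data.
segments = [
    (1, "Family"), (1, "Family"), (1, "Gender"),
    (2, "Family"), (2, "Aggression"),
    (3, "Female"), (3, "Family"),
]

# Total number of coded occurrences per category.
occurrences = Counter(cat for _, cat in segments)

# Number of distinct articles in which each category appears.
presence = Counter(cat for _, cat in {(art, cat) for art, cat in segments})

print(occurrences.most_common())
print(presence.most_common())
```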

Relevance:

80.00%

Publisher:

Abstract:

This project studies and compares the Bottom-Up and Top-Down methodologies used for product development within a manufacturing department in a collaborative environment. A product was developed using both methodologies, and their impact on the behaviour of the management indicators that measure an organization's performance was then analysed. The benefits of the Top-Down approach in the manufacture of large assemblies are also highlighted, taking a lathe as an example.

Relevance:

80.00%

Publisher:

Abstract:

In recent years, through several research projects, Universidad EAFIT has been developing a proposal to define the technological components that should make up an ecosystem of educational applications, in order to leverage the adoption of the ubiquity model in higher-education institutions. Through its research group on development and innovation in Information and Communication Technologies (GIDITIC), the university selected the first components of the ecosystem in earlier thesis work [1, 2]. In addition, work carried out by the local government of the Alcaldía de Medellín in its Medellín Ciudad Inteligente project [3] also selected some components required for the implementation of its portal. Both initiatives agree on the inclusion of an activity-logging component, known as an "experience storage system" or Learning Record Store (LRS). Given this background, the aim is to implement an LRS that meets the objectives of the university's project, following standards that ensure interoperability with the other components of the ecosystem of educational applications.
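The interoperability standard usually adopted for an LRS is the Experience API (xAPI), in which activity records are exchanged as actor-verb-object statements. Assuming xAPI is the standard chosen here, a minimal sketch of posting one such statement to a hypothetical LRS endpoint could look like this (the URL, names and identifiers are placeholders):

```python
import json
import urllib.request

# Hypothetical xAPI-style statement: actor - verb - object.
statement = {
    "actor": {"mbox": "mailto:student@example.edu", "name": "Example Student"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "http://example.edu/courses/ubiquity-101/module-1",
               "definition": {"name": {"en-US": "Module 1"}}},
}

# Placeholder LRS endpoint; a real deployment would also add authentication.
req = urllib.request.Request(
    "http://lrs.example.edu/xAPI/statements",
    data=json.dumps(statement).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "X-Experience-API-Version": "1.0.3"},
    method="POST",
)
# urllib.request.urlopen(req)  # commented out: the endpoint is fictitious
```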

Relevance:

80.00%

Publisher:

Abstract:

The Graphical User Interface (GUI) is an integral component of contemporary computer software, and a stable and reliable GUI is necessary for the correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other, and it is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the GUI's specification, its visible behaviours and static analysis of its program code. GUIs are typically dynamic in nature: their user-visible state is guided by the underlying program code and dynamic program state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based application and its underlying program code. The new model is able to test the GUI efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI test cases can be created by examining the interactions between the GUI of a GUI-based application and its program code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled and deployed for constructing long GUI test cases. These test cases are able to drive the GUI into states that were not reachable using existing models. Implementation and evaluation have been conducted using GUITAR, a fully automated, open-source GUI testing framework.
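A minimal sketch of the core idea, assuming a toy event-interaction graph whose edges stand for code-level interactions between event handlers inferred from static analysis; the real model, GUITAR's formats and the event names are not reproduced here. A long test case is generated as a walk that follows interacting-event edges.

```python
import random

# Hypothetical event-interaction graph: an edge e1 -> e2 means the handler of
# e2 reads program state written by the handler of e1, so firing them in
# sequence exercises a code-level interaction.
interactions = {
    "open_file":  ["edit_text", "save_file"],
    "edit_text":  ["save_file", "undo"],
    "undo":       ["edit_text"],
    "save_file":  ["close_file"],
    "close_file": [],
}

def long_testcase(start, length, rng=random.Random(0)):
    """Build a long event sequence by repeatedly following interaction edges."""
    case = [start]
    current = start
    for _ in range(length - 1):
        successors = interactions.get(current, [])
        if not successors:          # dead end: restart from the initial event
            current = start
        else:
            current = rng.choice(successors)
        case.append(current)
    return case

print(long_testcase("open_file", 10))
```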

Relevance:

80.00%

Publisher:

Abstract:

One of the challenges that neuroscience researchers pose to biomedical engineers is brain-machine interaction. The nervous system communicates by interpreting electrochemical signals, and implantable circuits make decisions in order to interact with the biological environment. It is well known that Parkinson's disease is related to a deficit of dopamine (DA). Different methods have been employed to control dopamine concentration, such as magnetic or electrical stimulators or drugs, but automatic control of neurotransmitter concentration is not currently employed; this work implements it. To do so, four systems were designed and developed: deep brain stimulation (DBS), transcranial magnetic stimulation (TMS), infusion pump control (IPC) for drug delivery, and fast-scan cyclic voltammetry (FSCV), whose sensing circuits detect the varying concentrations of neurotransmitters such as dopamine caused by these stimulations. Software was also developed to display and analyse the data synchronously with the events of the experiments. The setup allows the use of infusion pumps, and its flexibility is such that DBS or TMS can be used alone or combined with other stimulation techniques such as lights, sounds, etc. The developed system controls the DA concentration automatically. Its resolution is around 0.4 µmol/L, with a concentration-correction interval adjustable between 1 and 90 seconds. The system can control DA concentrations between 1 and 10 µmol/L, with an error of about ±0.8 µmol/L. Although designed to control DA concentration, the system can be used to control the concentration of other substances. It is proposed to continue the closed-loop development with FSCV and DBS (or TMS, or infusion) using parkinsonian animal models.
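A minimal sketch of the closed-loop idea described above (read the FSCV concentration estimate, compare it with a target, command the infusion pump), with a simple proportional correction applied at a fixed interval; the gain, the toy plant model and the interface functions are hypothetical stand-ins, not the actual instrument drivers.

```python
# Toy closed-loop dopamine-concentration controller.
# read_fscv_concentration() and set_infusion_rate() stand in for the real
# FSCV acquisition and infusion-pump interfaces, which are not shown here.

TARGET_UMOL_L = 5.0      # setpoint within the 1-10 umol/L working range
CORRECTION_S  = 10       # correction interval (1-90 s in the real system)
GAIN          = 0.5      # hypothetical proportional gain

concentration = 2.0      # simulated plant state (umol/L)

def read_fscv_concentration():
    return concentration

def set_infusion_rate(rate):
    """Apply the commanded infusion; here, a crude simulated plant response."""
    global concentration
    concentration += 0.1 * rate - 0.05 * concentration  # toy uptake/clearance

for step in range(20):                     # 20 correction cycles
    measured = read_fscv_concentration()
    error = TARGET_UMOL_L - measured
    rate = max(0.0, GAIN * error)          # the pump cannot infuse a negative amount
    set_infusion_rate(rate)
    print(f"t={step * CORRECTION_S:>3}s  DA={measured:.2f} umol/L  rate={rate:.2f}")
```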

Relevance:

80.00%

Publisher:

Abstract:

Over the past several decades, thousands of otoliths, bivalve shells, and scales have been collected for the purposes of age determination and remain archived in European and North American fisheries laboratories. Advances in digital imaging and computer software, combined with techniques developed by tree-ring scientists, provide a means to extract additional levels of information from these calcified structures and generate annually resolved (one value per year), multidecadal time series of population-level growth anomalies. Chemical and isotopic properties may also be extracted to provide additional information about the environmental conditions these organisms experienced. Given that they are exactly placed in time, chronologies can be directly compared to instrumental climate records, chronologies from other regions or species, or time series of other biological phenomena. In this way, chronologies may be used to reconstruct historical ranges of environmental variability, identify climatic drivers of growth, establish linkages within and among species, and generate ecosystem-level indicators. Following the first workshop in Hamburg, Germany, in December 2014, the second Workshop on Growth-increment Chronologies in Marine Fish: climate-ecosystem interactions in the North Atlantic (WKGIC2) met at the Mediterranean Institute for Advanced Studies headquarters in Esporles, Spain, on 18–22 April 2016, chaired by Bryan Black (USA) and Christoph Stransky (Germany). Thirty-six participants from fifteen different countries attended. The objectives were to i) review the applications of chronologies developed from growth-increment widths in the hard parts (otoliths, shells, scales) of marine fish and bivalve species, ii) review the fundamentals of crossdating and chronology development, iii) discuss the assumptions and limitations of these approaches, iv) measure otolith growth-increment widths in image analysis software, v) learn software to statistically check increment dating accuracy, vi) generate a growth-increment chronology and relate it to climate indices, and vii) initiate cooperative projects or training exercises to commence after the workshop. The workshop began with an overview of tree-ring techniques of chronology development, including a hands-on exercise in crossdating. Next, we discussed the applications of fish and bivalve biochronologies and the range of issues that could be addressed. We then reviewed key assumptions and limitations, especially those associated with short-lived species for which there are numerous and extensive otolith archives in European fisheries labs. Next, participants were provided with images of European plaice otoliths from the North Sea and taught to measure increment widths in image analysis software. Upon completion of the measurements, techniques of chronology development were discussed and contrasted with those that have been applied to long-lived species. Plaice growth time series were then related to environmental variability using the KNMI Climate Explorer. Finally, potential future collaborations and funding opportunities were discussed, and there was a clear desire to meet again to compare statistical techniques for chronology development using a range of existing fish, bivalve, and tree growth-increment datasets. Overall, we hope to increase the use of these techniques and, over the long term, develop networks of biochronologies for integrative analyses of ecosystem functioning and relationships to long-term climate variability and fishing pressure.
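A minimal sketch of the chronology-building step described above (detrend each individual's increment-width series, average the resulting indices by calendar year, then correlate the chronology with a climate index), using made-up numbers rather than the plaice measurements or KNMI data.

```python
import numpy as np

# Hypothetical increment-width series (mm) for three otoliths, keyed by the
# calendar year each increment was formed in (exact dating via crossdating).
series = {
    "fish_A": {2001: 0.50, 2002: 0.44, 2003: 0.40, 2004: 0.42, 2005: 0.35},
    "fish_B": {2002: 0.61, 2003: 0.52, 2004: 0.55, 2005: 0.47},
    "fish_C": {2001: 0.70, 2002: 0.62, 2003: 0.55, 2004: 0.60, 2005: 0.50},
}

def detrend(widths):
    """Divide each width by a fitted linear age trend, giving unitless indices."""
    years = np.array(sorted(widths))
    w = np.array([widths[y] for y in years])
    trend = np.polyval(np.polyfit(np.arange(len(w)), w, 1), np.arange(len(w)))
    return dict(zip(years, w / trend))

indexed = [detrend(s) for s in series.values()]
years = sorted({y for s in indexed for y in s})

# Mean chronology: average the available indices for each calendar year.
chronology = np.array([np.mean([s[y] for s in indexed if y in s]) for y in years])

# Hypothetical climate index (e.g. a sea-surface-temperature anomaly) per year.
climate = np.array([0.2, -0.1, 0.4, 0.1, -0.3])

r = np.corrcoef(chronology, climate)[0, 1]
print(dict(zip(years, np.round(chronology, 3))), "r =", round(r, 2))
```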

Relevance:

80.00%

Publisher:

Abstract:

The increased prevalence of iron deficiency among infants can be attributed, among other factors, to the consumption of an iron-deficient diet or a diet that interferes with iron absorption at the critical time of infancy. The gradual shift from breast milk to other foods and liquids is a transition period which contributes greatly to iron deficiency anaemia (IDA). The purpose of this research was to assess iron deficiency anaemia among infants aged six to nine months in Keiyo South Sub County. The specific objectives of this study were to establish the prevalence of iron deficiency anaemia and the dietary iron intake among infants aged 6 to 9 months. A cross-sectional study design was adopted for this survey, which was conducted in three health facilities in Keiyo South Sub County. The infants were selected using a two-stage cluster sampling procedure, and systematic random sampling was then used to select a total of 244 mothers and their infants: eighty-two (82) infants from Kamwosor sub-district hospital and eighty-one (81) each from the Nyaru and Chepkorio health facilities. Interview schedules, 24-hour dietary recalls and food frequency questionnaires were used to collect dietary iron intake data. Biochemical tests were carried out at the health facilities using the Hemo-control photometer. Infants whose hemoglobin levels were less than 11 g/dl were considered anaemic. Further, peripheral blood smears were conducted to ascertain the type of nutritional anaemia. Data were analysed using the Statistical Package for the Social Sciences (SPSS) computer software, version 17 (2009). Dietary iron intake was analysed using the NutriSurvey 2007 computer software. Results indicated that the mean hemoglobin value was 11.3 ± 0.84 g/dl. Twenty-one point seven percent (21.7%) of the infants had anaemia, and 100% of the peripheral blood smears indicated iron deficiency anaemia. Dietary iron intake was a predictor of iron deficiency anaemia in this study (t = -3.138; p = 0.01). Iron deficiency anaemia was evident among infants in Keiyo South Sub County. The Ministry of Health should formulate and implement policies on screening for anaemia and ensure intensive nutrition education on iron-rich diets during child welfare clinics.
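A minimal sketch of the anaemia classification and prevalence calculation described above (hemoglobin below 11 g/dl flagged as anaemic), using hypothetical hemoglobin values rather than the survey data.

```python
# Hypothetical hemoglobin measurements (g/dl) for a handful of infants;
# the study's threshold of 11 g/dl is applied to flag anaemia.
hemoglobin = [11.8, 10.4, 12.1, 10.9, 11.3, 9.8, 11.6, 12.0]

anaemic = [hb for hb in hemoglobin if hb < 11.0]
prevalence = 100.0 * len(anaemic) / len(hemoglobin)
mean_hb = sum(hemoglobin) / len(hemoglobin)

print(f"mean Hb = {mean_hb:.1f} g/dl, anaemia prevalence = {prevalence:.1f}%")
```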

Relevance:

80.00%

Publisher:

Abstract:

Predicting user behaviour enables user-assistant services to provide personalized services to users. This requires a comprehensive user model, which can be created by monitoring user interactions and activities. BaranC is a framework that performs user interface (UI) monitoring (and collects all associated context data), builds a user model, and supports services that make use of that user model. A prediction service, Next-App, was built to demonstrate the use of the framework and to evaluate the usefulness of such a prediction service. Next-App analyses a user's data, learns patterns, builds a model for the user, and finally predicts, based on the user model and the current context, what application(s) the user is likely to want to use. The prediction is pro-active and dynamic, reflecting the current context, and is also dynamic in that it responds to changes in the user model, as might occur over time as a user's habits change. Initial evaluation of Next-App indicates a high level of satisfaction with the service.
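A minimal sketch of context-conditioned next-app prediction, assuming a simplified usage log of (context, app) pairs; the actual Next-App model and the BaranC data formats are not reproduced here.

```python
from collections import Counter, defaultdict

# Hypothetical usage log: (context, app) pairs, where the context is a coarse
# label such as time-of-day plus location derived from monitored data.
log = [
    ("morning@home", "news"), ("morning@home", "mail"),
    ("morning@home", "news"), ("commute", "podcasts"),
    ("commute", "maps"), ("commute", "podcasts"),
    ("evening@home", "video"), ("evening@home", "news"),
]

# Per-context frequency model: here the user model is just launch counts.
model = defaultdict(Counter)
for context, app in log:
    model[context][app] += 1

def predict(context, k=2):
    """Return the k most likely apps for the given context."""
    return [app for app, _ in model[context].most_common(k)]

print(predict("morning@home"))   # -> ['news', 'mail']
print(predict("commute"))        # -> ['podcasts', 'maps']
```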

Relevance:

80.00%

Publisher:

Abstract:

A comprehensive user model, built by monitoring a user's current use of applications, can be an excellent starting point for building adaptive, user-centred applications. The BaranC framework monitors all user interaction with a digital device (e.g. a smartphone), and also collects all available context data (such as from sensors in the digital device itself, in a smart watch, or in smart appliances) in order to build a full model of user application behaviour. The model built from the collected data, called the UDI (User Digital Imprint), is further augmented by analysis services, for example a service that produces activity profiles from smartphone sensor data. The enhanced UDI model can then be the basis for building an appropriate adaptive application that is user-centred, as it is based on an individual user model. As BaranC supports continuous user monitoring, an application can be dynamically adaptive in real time to the current context (e.g. time, location or activity). Furthermore, since BaranC continuously augments the user model with more monitored data, the user model changes over time, and the adaptive application can adapt gradually to changing user behaviour patterns. BaranC has been implemented as a service-oriented framework in which the collection of data for the UDI and all sharing of the UDI data are kept strictly under the user's control. In addition, being service-oriented allows its monitoring and analysis services to be easily used (with the user's permission) by third parties in order to provide third-party adaptive assistant services. An example third-party service demonstrator, built on top of BaranC, proactively assists a user by dynamically predicting, based on the current context, what apps and contacts the user is likely to need. BaranC introduces an innovative, user-controlled, unified service model for monitoring and using personal digital activity data in order to provide adaptive user-centred applications. This aims to improve on the current situation, where the diversity of adaptive applications results in a proliferation of applications monitoring and using personal data, leading to a lack of clarity, a dispersal of data, and a diminution of user control.
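A minimal sketch of what one monitored UDI record might look like (a UI interaction augmented with context data and appended to a user-held store); the field names are illustrative assumptions, not the real UDI schema.

```python
import json
import time

# Hypothetical shape of a single UDI record: one monitored UI interaction
# augmented with whatever context data is available at that moment.

def make_udi_record(app, ui_event, context):
    return {
        "timestamp": time.time(),
        "app": app,
        "ui_event": ui_event,          # e.g. "tap:send_button"
        "context": context,            # sensor / wearable / appliance data
    }

udi_store = []                          # kept under the user's control

def monitor(app, ui_event, context):
    """Append a monitored interaction to the user's digital imprint."""
    udi_store.append(make_udi_record(app, ui_event, context))

monitor("messenger", "tap:send_button",
        {"location": "home", "activity": "sitting", "battery": 0.81})

print(json.dumps(udi_store[-1], indent=2))
```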

Relevance:

40.00%

Publisher:

Abstract:

The XSophe-Sophe-XeprView® computer simulation software suite enables scientists to easily determine spin Hamiltonian parameters from isotropic, randomly oriented and single-crystal continuous wave electron paramagnetic resonance (CW EPR) spectra from radicals and isolated paramagnetic metal ion centres or clusters found in metalloproteins, chemical systems and materials science. XSophe provides an X-windows graphical user interface to the Sophe programme and allows creation of multiple input files, local and remote execution of Sophe, and the display of sophelog (the output from Sophe) and input parameters/files. Sophe is a sophisticated computer simulation software programme employing a number of innovative technologies, including the Sydney OPera HousE (SOPHE) partition and interpolation schemes, a field segmentation algorithm, the mosaic misorientation linewidth model, parallelization and spectral optimisation. In conjunction with the SOPHE partition scheme and the field segmentation algorithm, the SOPHE interpolation scheme and the mosaic misorientation linewidth model greatly increase the speed of simulations for most spin systems. Employing brute-force matrix diagonalization in the simulation of an EPR spectrum from a high-spin Cr(III) complex with the spin Hamiltonian parameters ge = 2.00, D = 0.10 cm⁻¹, E/D = 0.25, Ax = 120.0, Ay = 120.0, Az = 240.0 × 10⁻⁴ cm⁻¹ requires a SOPHE grid size of N = 400 (to produce a good signal-to-noise ratio) and takes 229.47 s. In contrast, the use of either the SOPHE interpolation scheme or the mosaic misorientation linewidth model requires a SOPHE grid size of only N = 18 and takes 44.08 s and 0.79 s, respectively. Results from Sophe are transferred via the Common Object Request Broker Architecture (CORBA) to XSophe and subsequently to XeprView®, where the simulated CW EPR spectra (1D and 2D) can be compared to the experimental spectra. Energy level diagrams, transition roadmaps and transition surfaces aid the interpretation of complicated randomly oriented CW EPR spectra and can be viewed with a web browser and an OpenInventor scene graph viewer.
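For readers unfamiliar with the quantities being simulated, a minimal sketch of the zero-field-splitting part of such a spin Hamiltonian for a high-spin Cr(III) centre (S = 3/2), built by brute-force matrix construction with the D and E/D values quoted above; this is a generic textbook construction, not the SOPHE algorithm, and the Zeeman and hyperfine terms are omitted.

```python
import numpy as np

S = 1.5                                   # high-spin Cr(III), S = 3/2
D = 0.10                                  # cm^-1 (value quoted in the abstract)
E = 0.25 * D                              # from E/D = 0.25

m = np.arange(S, -S - 1, -1)              # m_S = 3/2, 1/2, -1/2, -3/2
dim = len(m)

Sz = np.diag(m)
# Ladder operator: <m+1|S+|m> = sqrt(S(S+1) - m(m+1)); row i-1 holds m[i]+1.
Sp = np.zeros((dim, dim))
for i in range(1, dim):
    Sp[i - 1, i] = np.sqrt(S * (S + 1) - m[i] * (m[i] + 1))
Sm = Sp.T
Sx = 0.5 * (Sp + Sm)
Sy = -0.5j * (Sp - Sm)

# Zero-field-splitting Hamiltonian (Zeeman and hyperfine terms omitted):
# H = D [Sz^2 - S(S+1)/3] + E (Sx^2 - Sy^2)
H = D * (Sz @ Sz - S * (S + 1) / 3 * np.eye(dim)) + E * (Sx @ Sx - Sy @ Sy)

energies = np.linalg.eigvalsh(H)
print(np.round(energies, 4), "cm^-1")     # two Kramers doublets split by 2*sqrt(D^2 + 3E^2)
```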

Relevance:

40.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Computer-assisted TDM is attracting growing interest and should improve further, especially in terms of information-system interfacing, user-friendliness, data storage capability and report generation.
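A minimal sketch of the a posteriori step that the Bayesian tools perform (maximum a posteriori estimation of an individual's pharmacokinetic parameters from a measured concentration, followed by a dose proposal), assuming a one-compartment IV model with hypothetical population priors; none of these numbers or names come from the programs reviewed above.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical one-compartment IV model with log-normal population priors on
# clearance CL (L/h) and volume V (L).
POP_CL, OMEGA_CL = 5.0, 0.3       # typical value and inter-individual SD (log scale)
POP_V,  OMEGA_V  = 40.0, 0.2
SIGMA = 0.15                       # proportional residual error

def concentration(dose, t, cl, v):
    """C(t) = (dose / V) * exp(-CL/V * t) for an IV bolus."""
    return dose / v * np.exp(-cl / v * t)

def neg_log_posterior(params, dose, times, observed):
    log_cl, log_v = params
    cl, v = np.exp(log_cl), np.exp(log_v)
    pred = concentration(dose, times, cl, v)
    residual = np.sum(((observed - pred) / (SIGMA * pred)) ** 2)
    prior = ((log_cl - np.log(POP_CL)) / OMEGA_CL) ** 2 + \
            ((log_v - np.log(POP_V)) / OMEGA_V) ** 2
    return 0.5 * (residual + prior)

# One measured concentration (mg/L) drawn 12 h after a 500 mg dose.
dose, times, observed = 500.0, np.array([12.0]), np.array([3.1])

fit = minimize(neg_log_posterior, x0=[np.log(POP_CL), np.log(POP_V)],
               args=(dose, times, observed))
cl_hat, v_hat = np.exp(fit.x)

# Dose proposal: scale the dose so the predicted 12 h concentration hits a
# hypothetical target of 4 mg/L.
target = 4.0
new_dose = dose * target / concentration(dose, 12.0, cl_hat, v_hat)
print(f"CL={cl_hat:.2f} L/h, V={v_hat:.1f} L, proposed dose={new_dose:.0f} mg")
```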

Relevance:

40.00%

Publisher:

Abstract:

Recently, morphometric measurements of the ascending aorta have been performed with ECG-gated multidetector computed tomography (MDCT) to support the development of future novel transcatheter therapies (TCT); nevertheless, the variability of such measurements remains unknown. Thirty patients referred for ECG-gated CT thoracic angiography were evaluated. Continuous reformations of the ascending aorta, perpendicular to the centreline, were obtained automatically with a commercially available computer-aided diagnosis (CAD) tool. Measurements of the maximal diameter were then made with the CAD and manually by two observers (separately), and were repeated one month later. The Bland-Altman method, Spearman coefficients, and the Wilcoxon signed-rank test were used to evaluate the variability, the correlation, and the differences between observers. The interobserver variability for the maximal diameter between the two observers was up to 1.2 mm, with limits of agreement [-1.5, +0.9] mm, whereas the intraobserver limits were [-1.2, +1.0] mm for the first observer and [-0.8, +0.8] mm for the second observer. The intraobserver CAD variability was 0.8 mm. The correlation between the observers and the CAD was good (0.980-0.986); however, significant differences do exist (P < 0.001). The maximum variability observed was 1.2 mm and should be taken into account in reports of measurements of the ascending aorta. The CAD is as reproducible as an experienced reader.
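A minimal sketch of the Bland-Altman limits-of-agreement computation used above, with hypothetical paired diameter measurements (mm) from two observers rather than the study data.

```python
import numpy as np

# Hypothetical maximal ascending-aorta diameters (mm) measured by two observers
# on the same patients; not the study's measurements.
obs1 = np.array([34.1, 36.8, 31.5, 40.2, 29.7, 35.0])
obs2 = np.array([34.6, 36.2, 32.1, 39.5, 30.3, 35.4])

diff = obs1 - obs2
bias = diff.mean()
sd = diff.std(ddof=1)

# Bland-Altman 95% limits of agreement: bias +/- 1.96 * SD of the differences.
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
print(f"bias = {bias:.2f} mm, limits of agreement = [{loa[0]:.2f}, {loa[1]:.2f}] mm")
```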