950 results for "Analysis in tablets"



Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique. Some were successful in detecting field defects comparable to those found with standard automated perimetry (SAP), while others were not very informative and required further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. The purpose of this study is to examine the benefit of adding the mfVEP hemifield intersector analysis protocol to the standard HFA test when there is suspicious glaucomatous visual field loss. Three groups were tested: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspects (38 eyes). All subjects underwent two standard Humphrey visual field (HFA) 24-2 tests, optical coherence tomography of the optic nerve head, and a single mfVEP test in one session. The mfVEP results were analysed with the new protocol, the Hemifield Sector Analysis (HSA) protocol. Retinal nerve fibre layer (RNFL) thickness was recorded to identify subjects with suspicious RNFL loss. The hemifield intersector analysis of the mfVEP results showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields differed significantly between the three groups (ANOVA, p<0.001, 95% CI). The superior-inferior difference was statistically significant in all 11/11 sectors in the glaucoma group (t-test, p<0.001), in 5/11 sectors in the glaucoma suspect group (t-test, p<0.01), and in only 1/11 sectors in the normal group (t-test, p<0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively; for glaucoma suspects they were 89% and 79%.
The use of SAP and mfVEP results in subjects with suspicious glaucomatous visual field defects, identified by low RNFL thickness, is beneficial in confirming early visual field defects. The new HSA protocol used in mfVEP testing can detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Used in addition to SAP analysis, it can provide information about focal visual field differences across the horizontal midline and confirm suspicious field defects. The sensitivity and specificity of the mfVEP test were very promising and correlated with other anatomical changes in glaucomatous field loss. The intersector analysis protocol can detect early field changes not detected by the standard HFA test.
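The reported sensitivity and specificity figures follow directly from the counts of correctly and incorrectly classified eyes. A minimal sketch of that computation (the counts below are hypothetical illustrations, not the study's actual confusion matrix):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true-positive rate among glaucomatous eyes
    specificity = tn / (tn + fp)  # true-negative rate among normal eyes
    return sensitivity, specificity

# Hypothetical counts for a glaucoma-vs-normal comparison
# (36 glaucoma eyes, 38 normal eyes, as in the study's group sizes):
sens, spec = sensitivity_specificity(tp=35, fn=1, tn=33, fp=5)
```

With group sizes like those in the study, 35 of 36 true positives and 33 of 38 true negatives would yield accuracy of roughly the reported order.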


2000 Mathematics Subject Classification: 62-04, 62H30, 62J20


One of the major problems for Critical Discourse Analysts is how to move on from their insightful critical analyses to successfully 'acting on the world in order to transform it'. This paper discusses, with detailed exemplification, some of the areas where linguists have moved beyond description to acting on and changing the world. Examples from three murder trials show how essential it is, in order to protect the rights of witnesses and defendants, to have audio records of significant interviews with police officers. The article moves on to discuss the potentially serious consequences of the many communicative problems inherent in legal/lay interaction and illustrates a few of the linguist-led improvements to important texts. Finally, the article turns to the problems of using linguistic data to try to determine the geographical origin of asylum seekers. The intention of the article is to act as a call to arms to linguists; it concludes with the observation that 'innumerable mountains remain for those with a critical linguistic perspective who would like to try to move one'. © 2011 John Benjamins Publishing Company.


Non-parametric methods for efficiency evaluation were designed to analyse industries comprising multi-input, multi-output producers for which market-price data are lacking. Education is a typical example. In this chapter, we review applications of data envelopment analysis (DEA) in secondary and tertiary education, focusing on the opportunities it offers for benchmarking at the institutional level. At the secondary level, we also investigate the disaggregation of efficiency measures into pupil-level and school-level effects. For higher education, while many analyses concern overall institutional efficiency, we also examine studies that take a more disaggregated approach, centred either on the performance of specific functional areas or on that of individual employees.
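In the general multi-input, multi-output case, DEA computes each unit's efficiency by solving a linear program per unit. In the special single-input, single-output case under constant returns to scale, the score reduces to each unit's output/input ratio normalized by the best ratio in the sample. A minimal sketch of that special case, with hypothetical school data:

```python
def dea_efficiency(inputs, outputs):
    """Constant-returns DEA scores for single-input, single-output units.

    In this special case the CCR efficiency of each unit is its
    output/input ratio divided by the best ratio in the sample; the
    general multi-input case requires a linear program per unit.
    """
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical schools: input = teaching hours, output = attainment score
scores = dea_efficiency(inputs=[100, 120, 150], outputs=[80, 120, 90])
```

Here the second unit defines the efficient frontier and the others are scored relative to it; benchmarking at institutional level amounts to reading off each unit's distance from that frontier.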


Skin blood microcirculation and tissue metabolic activity were examined in patients with type 2 diabetes. Laser Doppler flowmetry (LDF) with a 1064 nm laser source and fluorescence spectroscopy (FS) with excitation light at 365 nm and 450 nm were used to monitor blood perfusion and the content of the coenzymes NADH and FAD. In conclusion, the proposed combined LDF and tissue FS approach makes it possible to identify significant impairments of blood microcirculation and metabolic activity in type 2 diabetes patients.
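The NADH and FAD fluorescence intensities monitored here are commonly summarized as an optical redox ratio. The sketch below shows one conventional form of that ratio; it is a generic illustration, not the authors' processing pipeline, and the intensity values are hypothetical:

```python
def redox_ratio(i_fad, i_nadh):
    """Optical redox ratio FAD / (FAD + NADH) from fluorescence intensities.

    i_fad:  emission intensity under ~450 nm excitation (FAD channel)
    i_nadh: emission intensity under ~365 nm excitation (NADH channel)
    """
    return i_fad / (i_fad + i_nadh)

# Hypothetical channel intensities (arbitrary units):
rr = redox_ratio(i_fad=30.0, i_nadh=70.0)
```

Shifts in this ratio over time or between patient groups are one way to quantify the metabolic changes the abstract refers to.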


Our aim was to approach an important and readily investigable phenomenon, connected to a relatively simple but real field situation, in such a way that the results of field observations could be compared directly with the predictions of a simulation model system built on a simple mathematical apparatus, while simultaneously obtaining a hypothesis system that creates the theoretical basis for a later series of experimental studies. As the subject of the study, we chose the seasonal coenological changes of an aquatic and semiaquatic Heteroptera community. Based on the observed data, we developed an ecological model system that generates realistic patterns closely resembling the observed temporal patterns, that can give predictions for climatic circumstances not experienced before (e.g. climate change), and that can simulate experimental conditions. The stable coenological state-plane, constructed on the principle of indirect ordination, is suitable for the unified handling of monitoring and simulation data series and for their comparison. On this state-plane, deviations between empirical and model-generated data can be observed and analysed that would otherwise remain hidden.


Due to the linguistic turn that occurred in the second half of the twentieth century, a new postmodern tendency appeared in the field of social sciences. Discourse theory, with its unique perspective, had by the 1990s succeeded in building itself up as a discipline. This new scientific approach has appeared in several areas of social science. This study surveys the different theoretical trends and their related methods, and aims to demonstrate the applicability of discourse analysis in the fields of political and management science.


Today, global economic performance largely depends on digital ecosystems. E-commerce, cloud computing, social media, and the sharing economy are the main products of modern innovative economic systems, and they constantly raise new regulatory questions. While the United States has an unimpeachable dominance in innovation and new technologies, as well as a large and open domestic market, the EU is only recently discovering the importance of empowering the European digital economy and aims to break down its highly fragmented cross-border online economic environment. As the global economy rapidly becomes digital, Europe's effort to create and invest in a common digital market is understandable. The comprehensive investigations launched by the European Commission into the role of social network, search engine, and sharing-economy internet platforms, which are new-generation technologies dominated by American firms, or the recent decision of the Court of Justice of the European Union declaring the Commission's US Safe Harbor Decision invalid, might be considered part of an anti-American protectionist policy. However, these measures could rather be seen as part of a broader trend to foster European enterprises in technology development.


The health status of wild and captive Atlantic bottlenose dolphins (Tursiops truncatus) is difficult to ascertain. Mass strandings of these animals have been attributed to pollutants as well as to bacterial infections. Using human enzyme-linked immunosorbent assays (ELISA) for immunological cytokines, I measured soluble cytokine levels with respect to the animals' health status. In a retrospective analysis of dolphin sera, there was a trend towards higher cytokine levels in "sick" animals. I cultured dolphin lymphocytes in the presence of a mitogen (PHA), a superantigen (Staph-A), lipopolysaccharide (LPS), and a calcium flux inducer (PMA). Levels of messenger RNA from these cultured cells were assayed with the polymerase chain reaction (PCR) using primers for the human cytokines IL-2, IL-4, IL-6, IL-10, tumor necrosis factor, and interferon gamma. Only IL-4, IL-6, and IL-10 messages were obtained, indicating that these genes share sufficient nucleotide homology with the human primer sequences. The PCR products were sequenced. Sixteen IL-4 sequences, twelve IL-6 sequences, and seven IL-10 sequences were obtained and analyzed. Each cytokine exhibited the same nucleotide sequence in all dolphins examined. There was no difference in the cytokine profile in response to the various stimuli. The derived amino acid composition of each dolphin cytokine was used for molecular modeling, which showed that dolphin IL-4, IL-6, and IL-10 are structurally similar to the corresponding proteins of Perissodactyla.
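The finding that each cytokine exhibited the same nucleotide sequence in all dolphins amounts to pairwise identity comparisons of the sequenced PCR products. A minimal sketch of such a comparison, using short hypothetical aligned fragments rather than the study's actual sequences:

```python
def percent_identity(seq_a, seq_b):
    """Percent nucleotide identity between two aligned, equal-length sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

# Hypothetical aligned fragments (not the study's IL-4 sequences):
ident = percent_identity("ATGCATGCAT", "ATGCATGGAT")
```

Identical sequences score 100%; the figure drops by one unit per mismatch across a 100-base alignment, which is the sense in which cross-species primer homology is usually quantified.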


With advances in science and technology, computing and business intelligence (BI) systems are steadily becoming more complex, with an increasing variety of heterogeneous software and hardware components. They are thus becoming progressively more difficult to monitor, manage, and maintain. Traditional approaches to system management have largely relied on domain experts, through a knowledge acquisition process that translates domain knowledge into operating rules and policies. This is widely acknowledged to be a cumbersome, labor-intensive, and error-prone process, and one that struggles to keep up with rapidly changing environments. In addition, many traditional business systems deliver primarily pre-defined historic metrics for long-term strategic or mid-term tactical analysis, and lack the flexibility to support evolving metrics or data collection for real-time operational analysis. There is thus a pressing need for automatic and efficient approaches to monitoring and managing complex computing and BI systems. To realize the goal of autonomic management and enable self-management capabilities, we propose to mine the historical log data generated by computing and BI systems and automatically extract actionable patterns from it. This dissertation focuses on the development of data mining techniques to extract actionable patterns from various types of log data in computing and BI systems. Four key problems are studied: log data categorization and event summarization; leading indicator identification; pattern prioritization by exploring link structures; and a tensor model for three-way log data. Case studies and comprehensive experiments on real application scenarios and datasets show the effectiveness of the proposed approaches.
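Log data categorization and event summarization typically begin by collapsing raw log lines into templates, masking the variable fields so that recurring event types can be counted. A minimal sketch of that idea (not the dissertation's actual algorithm; the log lines below are hypothetical):

```python
import re
from collections import Counter

def log_template(line):
    """Collapse a raw log line into a template by masking variable fields."""
    line = re.sub(r"\b0x[0-9a-fA-F]+\b", "<HEX>", line)   # hex addresses first
    line = re.sub(r"\b\d+(\.\d+)*\b", "<NUM>", line)      # counts, IDs, versions
    return line

# Hypothetical log lines:
logs = [
    "disk 3 usage at 91 percent",
    "disk 7 usage at 88 percent",
    "service restarted with pid 4421",
]
counts = Counter(log_template(line) for line in logs)
```

The two disk lines collapse into one template with count 2; frequencies of such templates over time are the kind of summarized event stream from which leading indicators and actionable patterns can then be mined.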


Engineering analysis of geometric models has been the main, if not the only, credible tool used by engineers and scientists to resolve physical boundary problems. New high-speed computers have facilitated the accuracy and validation of the expected results. In practice, an engineering analysis is composed of two parts: the design of the model, and the analysis of the geometry with the boundary conditions and constraints imposed on it. Numerical methods are used to resolve a large number of physical boundary problems independently of the model geometry. The time expended in the computational process is related to the imposed boundary conditions and to how well conformed the geometry is. Any geometric model that contains gaps or open lines is considered an imperfect geometric model, and the major commercial solver packages are incapable of handling such inputs. Other packages apply methods such as patching or zippering to resolve these problems, but the repaired geometry may differ from the original, and the changes may be unacceptable. The study proposed in this dissertation is based on a new technique for processing models with geometric imperfections without the need to repair or change the original geometry. An algorithm is presented that analyzes the imperfect geometric model with the imposed boundary conditions using a meshfree method and a distance field approximation to the boundaries. Experiments are proposed to analyze the convergence of the algorithm on imperfect model geometries, and the results will be compared with those for the same models with perfect geometries. Plots of the results will be presented to support further analysis and conclusions about the algorithm's convergence.
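The distance field approximation referred to above needs only sampled boundary points, not a watertight mesh, which is why gaps and open lines can be tolerated. A brute-force sketch of an unsigned distance field (a real meshfree solver would use a spatial index and a smooth approximate distance function; the boundary sample here is hypothetical and deliberately coarse):

```python
import math

def distance_to_boundary(point, boundary_points):
    """Unsigned distance from a query point to a sampled boundary.

    Brute-force nearest-sample search; the key property is that the
    field is defined from points alone, so gaps in the original
    geometry do not invalidate the evaluation.
    """
    return min(math.dist(point, b) for b in boundary_points)

# Boundary of a unit square sampled only at its corners (coarse on purpose):
corners = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
d = distance_to_boundary((0.5, 0.0), corners)
```

Denser boundary sampling makes the field converge to the true distance, which mirrors the convergence experiments proposed for imperfect versus perfect geometries.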


Classification procedures, including atmospheric correction of satellite images as well as classification performance using calibration and validation at different levels, were investigated in the context of a coarse land-cover classification scheme for the Pachitea Basin. Two different correction methods were tested against no correction, in terms of correcting reflectance towards a common response for pseudo-invariant features (PIFs). The accuracy of the classifications derived from each of the three methods was then assessed in a discriminant analysis using cross-validation at the pixel, polygon, region, and image levels. The results indicate that only regression-adjusted images using PIFs show no significant difference between images in any of the bands. A comparison of classifications at different levels suggests, though, that at the pixel, polygon, and region levels the accuracy of the classifications does not differ significantly between corrected and uncorrected images. Spatial patterns of land cover were analyzed in terms of colonization history, infrastructure, suitability of the land, and landownership. The actual use of the land is driven mainly by the ability to access the land and markets, as is evident in the distribution of land cover as a function of distance to rivers and roads. When all rivers and roads are considered, the threshold distance at which agro-pastoral land cover switches from over-represented to under-represented is about 1 km. Suggestions of best land use seem not to affect the choice of land use. Differences in the abundance of land cover between watersheds are more prevalent than differences between colonist and indigenous groups.
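The regression adjustment using PIFs fits, for each spectral band, a linear gain and offset that maps the subject image's PIF reflectances onto the reference image. A minimal one-band sketch with hypothetical reflectance values (ordinary least squares; the real workflow repeats this per band and then applies the fitted mapping to the whole image):

```python
def pif_regression(subject, reference):
    """Least-squares gain/offset mapping subject-image PIF reflectances
    onto the reference image, for one spectral band.

    Returns (gain, offset) such that reference ~= gain * subject + offset.
    """
    n = len(subject)
    mean_s = sum(subject) / n
    mean_r = sum(reference) / n
    cov = sum((s - mean_s) * (r - mean_r) for s, r in zip(subject, reference))
    var = sum((s - mean_s) ** 2 for s in subject)
    gain = cov / var
    offset = mean_r - gain * mean_s
    return gain, offset

# Hypothetical PIF reflectances (e.g., bare rock, road surfaces) in one band:
gain, offset = pif_regression([0.10, 0.20, 0.30], [0.15, 0.25, 0.35])
```

Applying `gain * pixel + offset` to every pixel of the subject band normalizes it towards the reference scene, which is the adjustment whose classifications showed no significant between-image differences.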