998 results for Effective metrics


Relevance: 60.00%

Abstract:

This work presents the relationships between labor productivity and the training activities carried out in the telematics Military Organizations (OM) of the Brazilian Army (EB), which make up the Army Telematics System (SisTEx). The study covers the period from January 2010 to July 2011. SisTEx is best characterized by the Army Integrated Telematics Center (CITEx), the Area Telematics Centers (CTA), and the Telematics Centers (CT), subordinate to the Department of Science and Technology (DCT) and distributed throughout the national territory. The study addresses the concept of labor productivity and the training process within SisTEx. It discusses the knowledge areas of systemic interest and the strategic areas that should be covered by training, showing the results that emerged from the training carried out. It proposes suggestions to align training needs with the strategic areas, highlighting the importance of training in strategic planning while also taking individual interests into account. It relates strategies that represent a competitive differential in adding value for users. It comments on the use of distance learning (EAD) and classroom instruction to deliver training. It addresses the influence of training on productivity and on the perceived return on investment (ROI). It also relates SisTEx capabilities to studies of technological innovation in the service sector, and highlights the training carried out in the areas of information security and cyber defense. It concludes that labor productivity can be improved through the training that takes place in SisTEx, which acts as a vector of modernization and transformation operating directly on the production process, thereby accelerating the improvement of the quality of the IT services provided. It recommends future studies to verify the rate of accumulation of technological capabilities, the use of EAD for training of greater technical complexity, and the creation of metrics for effective ROI calculation. To this end, a bibliographic study of labor productivity and the SisTEx training process was carried out. The adopted method was the case study. Surveys and polls were applied to the chiefs and former chiefs of the CTA/CT and to SisTEx students, military personnel who underwent training during the period considered.

Relevance: 60.00%

Abstract:

We investigate the causal structure of general nonlinear electrodynamics and determine which Lagrangians generate an effective metric conformal to Minkowski. We also prove that there is only one analytic nonlinear electrodynamics that does not exhibit birefringence.
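
For context, the dispersion-relation form of the effective metric usually quoted in this literature is sketched below for a Lagrangian depending only on the invariant $F = F_{\mu\nu}F^{\mu\nu}$; the notation ($L_F$, $L_{FF}$) is standard and is assumed here rather than taken from the paper:

\[
g_{\mathrm{eff}}^{\mu\nu} \;=\; L_F\,\eta^{\mu\nu} \;-\; 4\,L_{FF}\,F^{\mu}{}_{\alpha}F^{\alpha\nu},
\qquad
L_F \equiv \frac{\partial L}{\partial F},\quad
L_{FF} \equiv \frac{\partial^2 L}{\partial F^2},
\]

with photon wave vectors obeying $g_{\mathrm{eff}}^{\mu\nu}k_\mu k_\nu = 0$. In this form the effective metric is conformal to Minkowski exactly when the second, field-dependent term either vanishes or is itself proportional to $\eta^{\mu\nu}$.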

Relevance: 60.00%

Abstract:

There are several classes of homogeneous Fermi systems that are characterized by the topology of the energy spectrum of fermionic quasiparticles: (i) gapless systems with a Fermi surface, (ii) systems with a gap in their spectrum, (iii) gapless systems with topologically stable point nodes (Fermi points), and (iv) gapless systems with topologically unstable lines of nodes (Fermi lines). Superfluid 3He-A and the electroweak vacuum belong to universality class (iii). The fermionic quasiparticles (particles) in this class are chiral: they are either left-handed or right-handed. The collective bosonic modes of class (iii) systems are the effective gauge and gravitational fields. The great advantage of superfluid 3He-A is that we can perform experiments with this condensed matter and thereby simulate many phenomena in high energy physics, including the axial anomaly, baryoproduction, and magnetogenesis. 3He-A textures induce a nontrivial effective metric of the space in which the free quasiparticles move along geodesics. With 3He-A one can simulate event horizons, Hawking radiation, a rotating vacuum, etc. High-temperature superconductors are believed to belong to class (iv). They have gapless fermionic quasiparticles with a “relativistic” spectrum close to the gap nodes, which allows the application of ideas developed for superfluid 3He-A.
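
As a minimal illustration of how such an effective metric arises (standard notation assumed, not taken from this abstract): near a Fermi point the quasiparticle dispersion is anisotropic, and rewriting it as a null condition defines an inverse metric,

\[
E^2(\mathbf p)\;\approx\;c_\parallel^2\,(p_\parallel-p_F)^2+c_\perp^2\,\mathbf p_\perp^{\,2}
\;\;\Longleftrightarrow\;\;
g^{\mu\nu}\tilde p_\mu \tilde p_\nu=0,
\qquad
g^{00}=-1,\quad
g^{ij}=c_\parallel^2\,\hat l^i\hat l^j+c_\perp^2\,(\delta^{ij}-\hat l^i\hat l^j),
\]

where $\hat{\mathbf l}$ is the orbital anisotropy axis, $p_\parallel=\mathbf p\cdot\hat{\mathbf l}$, $\tilde{\mathbf p}=\mathbf p-p_F\hat{\mathbf l}$, and $c_\parallel$, $c_\perp$ are the quasiparticle “speeds of light” along and across $\hat{\mathbf l}$. A spatially varying texture $\hat{\mathbf l}(\mathbf r)$ then plays the role of a curved effective metric.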

Relevance: 60.00%

Abstract:

X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].

Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.

As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus the responsibility of the imaging community to optimize the radiation dose of CT examinations. The key to dose optimization is to determine the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics to characterize the radiation dose and image quality of a CT exam. Moreover, if the radiation dose and image quality could be accurately predicted before the exam, it would be feasible to personalize the exam by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models to prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to implement the theoretical models into clinical practice by developing an organ-based dose monitoring system and an image-based noise-addition software tool for protocol optimization.

More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of the organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, this study uses the largest number of patient models to date, with a representative range of ages, weight percentiles, and body mass indices (BMI).
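
As an illustration of how such coefficients are typically applied, the sketch below assumes the exponential dependence of CTDIvol-normalized organ dose on effective patient diameter that is commonly reported for body protocols; the organ names and fit values are placeholders, not results from this work.

```python
import math

# Hypothetical fit coefficients (alpha, beta) per organ: illustrative values only,
# for the commonly reported model  h(d) = exp(alpha - beta * d), where h is the
# CTDIvol-normalized organ dose (mGy/mGy) and d is the effective diameter (cm).
FIT_COEFFS = {
    "liver":  (1.2, 0.045),
    "kidney": (1.1, 0.043),
}

def organ_dose(organ: str, effective_diameter_cm: float, ctdi_vol_mgy: float) -> float:
    """Predict organ dose (mGy) for a constant tube-current body scan."""
    alpha, beta = FIT_COEFFS[organ]
    h = math.exp(alpha - beta * effective_diameter_cm)  # size-dependent dose coefficient
    return h * ctdi_vol_mgy                             # scale by scanner output

# Example: a 28 cm effective-diameter patient scanned at CTDIvol = 10 mGy
print(round(organ_dose("liver", 28.0, 10.0), 2))
```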

With effective quantification of organ dose under constant tube current conditions, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model was validated by comparing the predicted organ dose with dose estimates from Monte Carlo simulations in which the TCM function was explicitly modeled.
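
A rough sketch of the convolution idea follows, under stated assumptions: the kernel, names, and numbers below are placeholders, and the actual implementation in the thesis may differ.

```python
import numpy as np

def organ_dose_tcm(mA_profile, organ_mask_z, kernel, ctdi_per_mAs, rotation_time_s):
    """Illustrative convolution-based TCM dose estimate: convolve the z-resolved
    tube current-time product with a dose-spread kernel to approximate the
    radiation field, then average over the slices covering the organ."""
    mAs = np.asarray(mA_profile, dtype=float) * rotation_time_s   # mAs per slice
    field = np.convolve(mAs, kernel, mode="same")                 # approximate z-axis dose field
    organ_field = field[np.asarray(organ_mask_z, dtype=bool)]     # slices containing the organ
    return ctdi_per_mAs * organ_field.mean()                      # convert to dose (mGy)

# Example with a sinusoidal modulation profile and a flat placeholder kernel
z = np.arange(200)
profile = 150 + 50 * np.sin(z / 15.0)        # mA, emulating anatomy-driven modulation
kernel = np.ones(11) / 11.0                  # normalized dose-spread kernel (placeholder)
mask = (z > 80) & (z < 120)                  # organ extent along z
print(round(organ_dose_tcm(profile, mask, kernel, ctdi_per_mAs=0.08, rotation_time_s=0.5), 2))
```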

Chapter 5 aims to implement the organ dose-estimation framework in clinical practice by developing an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). In the first phase of the study, we focused on body CT examinations; the patient’s major body landmark information was extracted from the patient scout image in order to match clinical patients to a computational phantom in the library. The organ dose coefficients were estimated based on the CT protocol and patient size, as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.

With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.
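
One simplified way to estimate quantum noise directly from a clinical slice, sketched below with assumed window size and soft-tissue thresholds, is to take the mode of the local standard deviations computed over soft-tissue voxels:

```python
import numpy as np
from scipy import ndimage

def global_noise_level(image_hu, soft_tissue_range=(-50, 150), window=7):
    """Estimate image noise (HU) as the histogram mode of local standard
    deviations inside soft tissue; thresholds and window size are illustrative."""
    img = image_hu.astype(float)
    mean = ndimage.uniform_filter(img, window)
    sq_mean = ndimage.uniform_filter(img * img, window)
    local_sd = np.sqrt(np.clip(sq_mean - mean * mean, 0, None))

    mask = (img > soft_tissue_range[0]) & (img < soft_tissue_range[1])
    hist, edges = np.histogram(local_sd[mask], bins=100, range=(0, 100))
    peak = np.argmax(hist)
    return 0.5 * (edges[peak] + edges[peak + 1])  # mode of the local-SD distribution

# Example on a synthetic soft-tissue slice with 12 HU of added noise
rng = np.random.default_rng(0)
slice_hu = rng.normal(40.0, 12.0, size=(256, 256))
print(round(global_noise_level(slice_hu), 1))   # close to the simulated 12 HU
```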

Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
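
A minimal sketch of step (1) is given below, assuming quantum-limited noise so that variance scales inversely with dose; a practical tool would additionally shape the added noise with the scanner's noise power spectrum rather than use white noise.

```python
import numpy as np

def simulate_reduced_dose(image_hu, sigma_full_hu, dose_fraction, rng=None):
    """Emulate a scan at `dose_fraction` of the original dose by adding zero-mean
    noise with sigma_add = sigma_full * sqrt(1/dose_fraction - 1), the extra noise
    needed when variance scales as 1/dose (white Gaussian noise for simplicity)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma_add = sigma_full_hu * np.sqrt(1.0 / dose_fraction - 1.0)
    return image_hu + rng.normal(0.0, sigma_add, size=image_hu.shape)

# Example: emulate a 50%-dose image from a full-dose slice whose noise is 10 HU
full_dose = np.zeros((64, 64))                   # placeholder uniform slice
half_dose = simulate_reduced_dose(full_dose, sigma_full_hu=10.0, dose_fraction=0.5)
print(round(half_dose.std(), 1))                 # ~10 HU of added noise
```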

Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.

Relevance: 30.00%

Abstract:

Purpose – The paper aims to describe a workforce-planning model developed in-house in an Australian university library that is based on rigorous environmental scanning of an institution, the profession and the sector. Design/methodology/approach – The paper uses a case study that describes the stages of the planning process undertaken to develop the Library’s Workforce Plan and the documentation produced. Findings – While it has been found that the process has had successful and productive outcomes, workforce planning is an ongoing process. To remain effective, the workforce plan needs to be reviewed annually in the context of the library’s overall planning program. This is imperative if the plan is to remain current and to be regarded as a living document that will continue to guide library practice. Research limitations/implications – Although a single case study, the work has been contextualized within the wider research into workforce planning. Practical implications – The paper provides a model that can easily be deployed within a library without external or specialist consultant skills, and due to its scalability can be applied at department or wider level. Originality/value – The paper identifies the trends impacting on, and the emerging opportunities for, university libraries and provides a model for workforce planning that recognizes the context and culture of the organization as key drivers in determining workforce planning. Keywords – Australia, University libraries, Academic libraries, Change management, Manpower planning. Paper type – Case study

Relevance: 30.00%

Abstract:

Effective enterprise information security policy management requires review and assessment activities to ensure that information security policies are aligned with business goals and objectives. Since security policy management involves both the policy development process and the security policy as its output, the context for security policy assessment requires goal-based metrics for these two elements. However, current security management assessment methods only provide checklist-type assessments that are predefined by industry best practices and do not allow for developing specific goal-based metrics. Utilizing theories drawn from the literature, this paper proposes the Enterprise Information Security Policy Assessment approach, which expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied in a case scenario example to illustrate a practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows for the concurrent undertaking of process-based and product-based assessment. Recommendations for further research include empirical research to validate the propositions and practical application of the proposed assessment approach in case studies, providing opportunities to introduce further enhancements to the approach.
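
As a hedged illustration of the Goal-Question-Metric structure the approach builds on, the sketch below encodes one goal with derived questions and metrics; the example goal, questions, and metrics are invented for illustration and are not drawn from the paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Metric:
    name: str
    unit: str

@dataclass
class Question:
    text: str
    metrics: List[Metric] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str        # e.g. "Assess"
    issue: str          # the quality focus
    obj: str            # the object under study
    viewpoint: str      # whose perspective
    questions: List[Question] = field(default_factory=list)

policy_goal = Goal(
    purpose="Assess",
    issue="alignment with business goals and objectives",
    obj="enterprise information security policy",
    viewpoint="security management",
    questions=[
        Question("How current is the policy?",
                 [Metric("time since last review", "months")]),
        Question("How completely is the policy development process followed?",
                 [Metric("process steps completed", "% of defined steps")]),
    ],
)
```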

Relevance: 30.00%

Abstract:

While enhanced cybersecurity options, mainly based around cryptographic functions, are needed, the overall speed and performance of a healthcare network may take priority in many circumstances. As such, the overall security and performance metrics of those cryptographic functions in their embedded context need to be understood. Understanding those metrics has been the main aim of this research activity. This research reports on an implementation of one network security technology, Internet Protocol Security (IPSec), to assess security performance. It simulates sensitive healthcare information being transferred over networks and then measures data delivery times under selected security parameters for various communication scenarios on Linux-based and Windows-based systems. Based on our test results, this research has revealed a number of network security metrics that need to be considered when designing and managing network security for healthcare-specific or non-healthcare-specific systems from security, performance, and manageability perspectives. This research proposes practical recommendations, based on the test results, for the effective selection of network security controls to achieve an appropriate balance between network security and performance.
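
A minimal sketch of the kind of delivery-time measurement described is shown below; the IPSec protection is assumed to be configured at the operating-system level, so the same transfer can be timed with and without security enabled, and the endpoint and port are hypothetical.

```python
import socket
import time

def measure_delivery_time(host: str, port: int, payload: bytes) -> float:
    """Send a fixed payload to an echo-style receiver and return the elapsed
    wall-clock time in seconds; run with and without IPSec to compare overhead."""
    start = time.perf_counter()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)
        sock.shutdown(socket.SHUT_WR)     # signal end of transmission
        while sock.recv(65536):           # drain the receiver's response
            pass
    return time.perf_counter() - start

# Example (hypothetical endpoint): time a 10 MB transfer of simulated health records
# print(measure_delivery_time("192.0.2.10", 5001, b"\x00" * 10_000_000))
```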

Relevance: 30.00%

Abstract:

The motivation behind the fusion of Intrusion Detection Systems was the realization that, with increasing traffic and increasing complexity of attacks, none of the present-day stand-alone Intrusion Detection Systems can meet the high demand for a very high detection rate and an extremely low false positive rate. Multi-sensor fusion can be used to meet these requirements by a refinement of the combined response of different Intrusion Detection Systems. In this paper, we show the design technique of sensor fusion to best utilize the useful response from multiple sensors by an appropriate adjustment of the fusion threshold. The threshold is generally chosen according to past experience or by an expert system. In this paper, we show that choosing the threshold bounds according to the Chebyshev inequality principle performs better. This approach also helps to solve the problem of scalability and has the advantage of failsafe capability. This paper theoretically models the fusion of Intrusion Detection Systems for the purpose of proving the improvement in performance, supplemented with empirical evaluation. The combination of complementary sensors is shown to detect more attacks than the individual components. Since the individual sensors chosen detect sufficiently different attacks, their results can be merged for improved performance. The combination is done in different ways: (i) taking all the alarms from each system and avoiding duplications, (ii) taking alarms from each system by fixing threshold bounds, and (iii) rule-based fusion with a priori knowledge of the individual sensor performance. A number of evaluation metrics are used, and the results indicate that there is an overall enhancement in the performance of the combined detector using sensor fusion incorporating the threshold bounds, and significantly better performance using simple rule-based fusion.
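
A simplified sketch of the Chebyshev-based threshold choice follows; the function names and baseline scores are illustrative, and the paper's exact bound may be formulated differently. Because the Chebyshev inequality is distribution-free, the resulting bound holds whatever the (unknown) distribution of the fused score, at the cost of being conservative.

```python
import math
from statistics import mean, pstdev

def chebyshev_threshold(normal_scores, max_false_alarm_rate):
    """Choose a fusion threshold so that, by the Chebyshev inequality
    P(|X - mu| >= k*sigma) <= 1/k^2, the false-alarm probability on
    attack-free traffic is bounded by `max_false_alarm_rate`."""
    mu, sigma = mean(normal_scores), pstdev(normal_scores)
    k = 1.0 / math.sqrt(max_false_alarm_rate)
    return mu + k * sigma        # alarm when the fused score exceeds this value

# Example: bound false alarms at 1% given fused scores observed on normal traffic
baseline = [0.12, 0.08, 0.15, 0.10, 0.09, 0.11, 0.13, 0.07]
print(round(chebyshev_threshold(baseline, 0.01), 3))
```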

Relevance: 30.00%

Abstract:

In this paper we propose a generalisation of the k-nearest neighbour (k-NN) retrieval method based on an error function using distance metrics in the solution and problem space. It is an interpolative method which is proposed to be effective for sparse case bases. The method applies equally to nominal, continuous and mixed domains, and does not depend upon an embedding n-dimensional space. In continuous Euclidean problem domains, the method is shown to be a generalisation of Shepard's interpolation method. We term the retrieval algorithm the Generalised Shepard Nearest Neighbour (GSNN) method. A novel aspect of GSNN is that it provides a general method for interpolation over nominal solution domains. The performance of the retrieval method is examined with reference to the Iris classification problem and to a simulated sparse nominal-value test problem. The introduction of a solution-space metric is shown to outperform conventional nearest neighbour methods on sparse case bases.
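
A minimal sketch of the interpolation idea is given below, with assumed function and argument names rather than the authors' code: the k nearest cases are weighted by inverse problem-space distance, and the candidate solution minimising the weighted sum of squared solution-space distances is returned.

```python
def gsnn_predict(query, cases, k, d_problem, d_solution, candidate_solutions, p=2):
    """Generalised-Shepard-style retrieval sketch: inverse-distance (Shepard)
    weights over the k nearest cases, then minimisation of a weighted error in
    the solution space over a candidate solution set (names are illustrative)."""
    neighbours = sorted(cases, key=lambda c: d_problem(query, c["problem"]))[:k]
    weights = [1.0 / (d_problem(query, c["problem"]) ** p + 1e-12) for c in neighbours]

    def error(s):
        return sum(w * d_solution(s, c["solution"]) ** 2
                   for w, c in zip(weights, neighbours))

    return min(candidate_solutions, key=error)

# Toy nominal domain: solutions are class labels compared with a 0/1 metric
cases = [{"problem": (1.0,), "solution": "A"},
         {"problem": (2.0,), "solution": "A"},
         {"problem": (9.0,), "solution": "B"}]
d_p = lambda x, y: abs(x[0] - y[0])
d_s = lambda a, b: 0.0 if a == b else 1.0
print(gsnn_predict((1.5,), cases, k=3, d_problem=d_p, d_solution=d_s,
                   candidate_solutions=["A", "B"]))   # -> "A"
```

With a 0/1 metric on a nominal solution space, as in the toy example, the minimisation reduces to a distance-weighted vote, which is one way to see how such a method generalises conventional k-NN.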

Relevance: 30.00%

Abstract:

As the alteration of freshwater habitats increases, it becomes critical to identify the habitat components that influence fisheries productivity metrics. We compared the relative contribution of three types of habitat variables to explaining the variance of abundance, biomass, and richness metrics using fish habitat models, and identified the habitat variables most effective at explaining these variations. During the summers of 2012 and 2013, fish communities at 43 littoral sites were sampled in Lac du Bonnet, a reservoir in southeastern Manitoba (Canada). Seven sampling scenarios, differing by gear, year, and time of day, were used to estimate abundance, biomass, and richness at each site, all species combined. Three types of habitat variables were evaluated: local variables (within the site), lateral variables (characterizing the bank), and contextual variables (position relative to landscape features). Local and contextual habitat variables together explained on average 44% (adjusted R²) of the variation in fisheries productivity metrics, whereas lateral habitat variables explained only 2% of the variation. The most frequently significant variables were macrophyte cover, distance to tributaries ≥ 50 m wide, and distance to wetlands ≥ 100,000 m² in area, suggesting that these variables are the most effective at explaining variation in fisheries productivity metrics in the littoral zone of reservoirs.

Relevance: 30.00%

Abstract:

The high-throughput experimental data from the new gene microarray technology has spurred numerous efforts to find effective ways of processing microarray data to reveal real biological relationships among genes. This work proposes an innovative data pre-processing approach to identify noise data in the data sets and to eliminate or reduce the impact of that noise on gene clustering. With the proposed algorithm, the pre-processed data sets make the clustering results stable across clustering algorithms with different similarity metrics, the important information of genes and features is kept, and the clustering quality is improved. A preliminary evaluation on real microarray data sets has shown the effectiveness of the proposed algorithm.

Relevance: 30.00%

Abstract:

Purpose:  The purpose of the study was to obtain anterior segment biometry for 40 normal eyes and to measure variables that may be useful to design large diameter gas permeable contact lenses that sit outside the region normally viewed by corneal topographers. Also, the distribution of these variables in the normal eye and how well they correlated to each other were determined.

Methods:  This is a cross-sectional study, in which data were collected at a single study visit. Corneal topography and imaging of the anterior segment of the eye were performed using the Orbscan II and Visante OCT. The variables that were collected were horizontal K reading, central corneal/scleral sagittal depth at 15 mm chord, and nasal and temporal angles at the 15 mm chord using the built-in software measurement tools.

Results: The central horizontal K readings for the 40 eyes were 43 ± 1.73 D (7.85 ± 0.31 mm), with a 95% confidence interval (CI) of 38.7 D (8.7 mm) to 46.6 D (7.24 mm). The mean corneal/scleral sagittal depth at the 15 mm chord was 3.74 ± 0.19 mm, with a range of 3.14 to 4.04 mm. The average nasal angle (which was not different from the temporal angle) at the 15 mm chord was 39.32 ± 3.07 degrees, with a 95% CI of 33.7 to 45.5 degrees. The K reading and the corneal/scleral sagittal depth showed the strongest correlation (0.58, p < 0.001). The corneal/scleral sagittal depth at 15 mm correlated less strongly with the nasal angle (0.44, p = 0.004), and the weakest correlation was between the nasal angle at 15 mm and the horizontal K readings (0.32, p = 0.046).

Conclusion:  The Visante OCT is a valuable tool for imaging the anterior segment of the eye. The Visante OCT is especially effective in providing the biometry of the peripheral cornea and sclera and may help in fitting GP lenses with a higher percentage of initial lens success, when the corneal sag and lens sag are better matched.

Relevance: 30.00%

Abstract:

Legal academics are not only teachers but also creators of knowledge. The role of an academic includes a responsibility to share this knowledge through engagement not just of their students, but also of the wider community. In addition, there is increasing emphasis on legal academics having to account for the so-called ‘impact’ of research. In selecting both the topic of their research and the mode of publication of their knowledge, legal academics act as gatekeepers. There is an increasing critique of the existing paradigm of research publication and its emphasis on the metrics of impact. This critique recognises the limitations of the commercial publication paradigm in the present context of open access and the vast array of citizen-mediated platforms for dissemination of legal knowledge and innovation. Susskind (Tomorrow’s Lawyers 2013) for example identifies expert crowd-sourced legal information as breaking down barriers to access to justice. Tracking their experience with publication of a paper on social media in legal education from the ALTA conference in 2012, the authors share an auto-ethnographic account of their insights into the potential for both impact and engagement of a diverse audience in their research. This highlights the ways in which various media can be used strategically to redefine the role of the gatekeeper.

Relevance: 30.00%

Abstract:

Content-based image retrieval (CBIR) is still a challenging issue due to the inherent complexity of images and the choice of the most discriminant descriptors. Recent developments in the field have introduced multidimensional projections to boost accuracy in the retrieval process, but many issues, such as the introduction of pattern recognition tasks and deeper user intervention to assist the process of choosing the most discriminant features, still remain unaddressed. In this paper, we present a novel framework for CBIR that combines pattern recognition tasks, class-specific metrics, and multidimensional projection to devise an effective and interactive image retrieval system. User interaction plays an essential role in the computation of the final multidimensional projection from which image retrieval will be attained. Results have shown that the proposed approach outperforms existing methods, turning out to be a very attractive alternative for managing image data sets.

Relevance: 30.00%

Abstract:

Development of desalination projects requires simple methodologies and tools for cost-effective and environmentally sensitive management. Sentinel taxa and biotic indices are easily interpreted from the perspective of environmental management. Echinoderms are a potential sentinel taxon for gauging the impact produced by brine discharge, and the BOPA index is considered an effective tool for monitoring different types of impact. The salinity increase due to desalination brine discharge was evaluated in terms of these two indicators. They reflected the environmental impact and the recovery after implementation of a mitigation measure. Echinoderms disappeared at the station closest to the discharge during the years with the highest salinity and then recovered their abundance after installation of a diffuser reduced the salinity increase. In the same period, BOPA responded through the decrease in sensitive amphipods and the increase in tolerant polychaete families when salinities rose. Although salinity changes explained most of the observed variability in both indicators, other abiotic parameters were also significant in explaining this variability.
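
For reference, a small sketch of the BOPA computation as it is commonly defined in the benthic-indicator literature is given below; the formula is the standard frequency-based one, and the example counts are invented.

```python
import math

def bopa_index(opportunistic_polychaetes: int, amphipods: int, total_individuals: int) -> float:
    """BOPA = log10(fP / (fA + 1) + 1), where fP and fA are the frequencies
    (proportions of total abundance) of opportunistic polychaetes and of
    amphipods; higher values indicate a more disturbed benthic community."""
    f_p = opportunistic_polychaetes / total_individuals
    f_a = amphipods / total_individuals
    return math.log10(f_p / (f_a + 1.0) + 1.0)

# Example: 200 individuals, 60 opportunistic polychaetes, 10 amphipods
print(round(bopa_index(60, 10, 200), 3))
```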