947 results for Indicator Component Framework (ICF)


Relevance:

20.00%

Publisher:

Abstract:

This study investigated a new performance indicator for assessing climbing fluency (smoothness of the hip trajectory and orientation of a climber, quantified by normalized jerk coefficients) and explored the effects of practice and hold design on performance. Eight experienced climbers completed four repetitions of two 10-m-high routes of similar difficulty but differing in hold graspability (holds with one edge vs holds with two edges). An inertial measurement unit attached to each climber's hips collected 3D acceleration and 3D orientation data, from which jerk coefficients were computed. Results showed a high correlation (r = .99, P < .05) between the normalized jerk coefficients of hip trajectory and orientation. Normalized jerk coefficients were higher for the route with two graspable edges, perhaps due to more complex route finding and action regulation behaviors; this effect decreased with practice. The jerk coefficient of hip trajectory and orientation could be a useful indicator of climbing fluency for coaches, as its computation takes into account both spatial and temporal parameters (i.e., changes in both the climbing trajectory and the time taken to travel it).
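A minimal sketch of how such a dimensionless jerk coefficient could be computed from IMU data is shown below. The abstract does not give the paper's exact normalization; scaling the integrated squared jerk by duration^5 / path_length^2 is one common dimensionless form and is assumed here, as are the function and variable names.

```python
import numpy as np

def normalized_jerk(accel: np.ndarray, fs: float) -> float:
    """Dimensionless jerk cost from hip acceleration.

    accel: (N, 3) acceleration samples in m/s^2; fs: sampling rate in Hz.
    """
    dt = 1.0 / fs
    jerk = np.gradient(accel, dt, axis=0)                # time derivative of acceleration
    jerk_cost = np.sum(np.sum(jerk ** 2, axis=1)) * dt   # rectangle-rule integral of |jerk|^2
    duration = accel.shape[0] * dt
    # Amplitude term recovered by double integration of acceleration
    # (assumes sensor drift has been removed upstream, e.g. by filtering).
    vel = np.cumsum(accel, axis=0) * dt
    pos = np.cumsum(vel, axis=0) * dt
    path_length = np.linalg.norm(np.diff(pos, axis=0), axis=1).sum()
    return float(np.sqrt(jerk_cost * duration ** 5 / path_length ** 2))
```

A parallel computation on the 3D orientation time series yields the orientation jerk coefficient; lower values indicate smoother, more fluent climbing.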

Relevance:

20.00%

Publisher:

Abstract:

The timing of widespread continental emergence is generally considered to have had a dramatic effect on the hydrological cycle, atmospheric conditions, and climate. New secondary ion mass spectrometry (SIMS) oxygen and laser-ablation–multicollector–inductively coupled plasma–mass spectrometry (LA-MC-ICP-MS) Lu-Hf isotopic results from dated zircon grains in the granitic Neoarchean Rum Jungle Complex provide a minimum time constraint on the emergence of continental crust above sea level for the North Australian craton. A 2535 ± 7 Ma monzogranite is characterized by magmatic zircon with slightly elevated δ18O (6.0‰–7.5‰ relative to Vienna standard mean ocean water [VSMOW]), consistent with some contribution to the magma from reworked supracrustal material. A supracrustal contribution to magma genesis is supported by the presence of metasedimentary rock enclaves, a large population of inherited zircon grains, and subchondritic zircon Hf (εHf = −6.6 to −4.1). A separate, distinct crustal source to the same magma is indicated by inherited zircon grains that are dominated by low δ18O values (2.5‰–4.8‰, n = 9 of 15) across a range of ages (3536–2598 Ma; εHf = −18.2 to +0.4). The low δ18O grains may be the product of one of two processes: (1) grain-scale diffusion of oxygen in zircon by exchange with a low δ18O magma or (2) several episodes of magmatic reworking of a Mesoarchean or older low δ18O source. Both scenarios require shallow crustal magmatism in emergent crust, to allow interaction with rocks altered by hydrothermal meteoric water in order to generate the low δ18O zircon. In the first scenario, assimilation of these altered rocks during Neoarchean magmatism generated low δ18O magma with which residual detrital zircons were able to exchange oxygen, while preserving their U-Pb systematics. In the second scenario, wholesale melting of the altered rocks occurred in several distinct events through the Mesoarchean, generating low δ18O magma from which zircon crystallized. Ultimately, in either scenario, the low δ18O zircons were entrained as inherited grains in a Neoarchean granite. The data suggest operation of a modern hydrological cycle by the Neoarchean and add to evidence for the increased emergence of continents by this time.
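For readers unfamiliar with the notation, δ18O expresses a sample's oxygen isotope ratio as a per-mil deviation from the VSMOW reference ratio:

```latex
\[
\delta^{18}\mathrm{O}
  = \left(
      \frac{(^{18}\mathrm{O}/^{16}\mathrm{O})_{\mathrm{sample}}}
           {(^{18}\mathrm{O}/^{16}\mathrm{O})_{\mathrm{VSMOW}}} - 1
    \right) \times 10^{3}\ \text{‰}
\]
```

Mantle-equilibrated zircon clusters near δ18O ≈ 5.3‰, so the 2.5‰–4.8‰ values reported here are the "low δ18O" signature diagnostic of sources altered by heated meteoric water.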

Relevance:

20.00%

Publisher:

Abstract:

Increasingly large-scale applications are generating an unprecedented amount of data. However, the growing gap between computation and I/O capacity on High End Computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to the output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis contends with the simulation for computing resources, and such contention can severely degrade simulation performance on HEC machines. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the location of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To identify the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system was developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as for large-scale data transfer. Two use cases, scientific data compression and remote visualization, were applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transmission bandwidth and improves application end-to-end transfer performance.
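The core placement decision can be illustrated with a small model (a sketch under assumed names and numbers, not the FlexAnalytics implementation): given the rate at which each site along the I/O path can run the analytics kernel and the bandwidth of each outgoing link, choose the analytics location that minimizes estimated end-to-end time.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    compute_rate: float      # MB/s at which the analytics kernel runs at this site
    link_bandwidth: float    # MB/s of the link from this site toward the client

def end_to_end_time(data_mb: float, reduction: float,
                    sites: list[Site], placement: int) -> float:
    """Estimated time for data to traverse the I/O path, with analytics
    (e.g. compression) executed at sites[placement]."""
    size, total = data_mb, 0.0
    for i, site in enumerate(sites):
        if i == placement:
            total += size / site.compute_rate   # run the analytics kernel here
            size *= reduction                   # output is smaller than input
        total += size / site.link_bandwidth     # forward along the I/O path
    return total

sites = [Site("simulation node", 200.0, 50.0), Site("staging node", 400.0, 100.0)]
best = min(range(len(sites)), key=lambda p: end_to_end_time(1024.0, 0.25, sites, p))
print("place analytics at:", sites[best].name)
```

Running analytics early shrinks the data before the slowest links but steals cycles from the simulation; an explicit estimate of this kind makes the trade-off visible.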

Relevance:

20.00%

Publisher:

Abstract:

An intrinsic challenge in evaluating proposed techniques for detecting Distributed Denial-of-Service (DDoS) attacks and distinguishing them from Flash Events (FEs) is the extreme scarcity of publicly available real-world traffic traces. Those available are either heavily anonymised or too old to accurately reflect current trends in DDoS attacks and FEs. This paper proposes a traffic generation and testbed framework for synthetically generating different types of realistic DDoS attacks, FEs and other benign traffic traces, and for monitoring their effects on the target. Using only modest hardware resources, the proposed framework, built around a customised software traffic generator, 'Botloader', is capable of generating a configurable mix of two-way traffic for emulating large-scale DDoS attacks, FEs or benign traffic traces that are experimentally reproducible. Botloader uses IP aliasing, a well-known technique available on most computing platforms, to create thousands of interactive UDP/TCP endpoints on a single computer, each bound to a unique IP address, to emulate large numbers of simultaneous attackers or benign clients.
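A minimal illustration of the IP-aliasing idea follows (an assumption-laden sketch, not Botloader's actual code): once aliases such as 10.0.0.2–10.0.0.254 have been added to one interface (e.g. `ip addr add 10.0.0.x/24 dev eth0` on Linux), a single host can bind one socket per alias so that each emulated client presents its own source address to the target.

```python
import socket

def make_clients(base: str = "10.0.0.", first: int = 2, count: int = 50):
    """Bind one UDP socket to each aliased address; each looks like a distinct client.

    Assumes the aliases already exist on a local interface; bind() fails otherwise.
    """
    clients = []
    for i in range(first, first + count):
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind((f"{base}{i}", 0))   # OS assigns an ephemeral port per alias
        clients.append(s)
    return clients

# Each socket can then send traffic that arrives with its own source IP, e.g.:
# for s in make_clients(): s.sendto(b"payload", ("target.example", 80))
```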

Relevance:

20.00%

Publisher:

Abstract:

Structural equation modeling (SEM) is a powerful statistical approach for testing networks of direct and indirect theoretical causal relationships in complex data sets with intercorrelated dependent and independent variables. SEM is commonly applied in ecology, but the spatial information often present in ecological data remains difficult to model in an SEM framework. Here we propose a simple method for spatially explicit SEM (SE-SEM) based on the analysis of variance/covariance matrices calculated across a range of lag distances. This method provides readily interpretable plots of the change in path coefficients across scale and can be implemented using any standard SEM software package. We demonstrate the application of this method using three studies examining the relationships between environmental factors, plant community structure, nitrogen fixation, and plant competition. By design, these data sets had a spatial component, but were previously analyzed using standard SEM models. Using these data sets, we demonstrate the application of SE-SEM to regularly spaced, irregularly spaced, and ad hoc spatial sampling designs and discuss the increased inferential capability of this approach compared with standard SEM. We provide an R package, sesem, to easily implement spatial structural equation modeling.
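The preprocessing step this implies can be sketched as follows (assumptions: Euclidean distances and illustrative function names; the paper's sesem package is in R, so this Python version is only a schematic): group observation pairs into lag-distance bins and compute one covariance matrix per bin, each of which is then fitted with a standard SEM.

```python
import numpy as np

def lag_covariances(coords: np.ndarray, data: np.ndarray, bin_edges):
    """coords: (n, 2) site locations; data: (n, p) variables; bin_edges: lag bins.

    Returns one (p, p) covariance matrix per lag bin (bins must be populated).
    """
    n = len(coords)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    upper = np.arange(n)[:, None] < np.arange(n)[None, :]   # each pair counted once
    dev = data - data.mean(axis=0)
    mats = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        i, j = np.where((dist > lo) & (dist <= hi) & upper)
        # Cross-products of paired deviations estimate covariance at this lag.
        mats.append((dev[i][:, :, None] * dev[j][:, None, :]).mean(axis=0))
    return mats
```

Fitting the same path model to each matrix and plotting a coefficient against lag distance gives the "path coefficients across scale" plots described above.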

Relevance:

20.00%

Publisher:

Abstract:

Pattern recognition is a promising approach for identifying structural damage from measured dynamic data. Much of the research on pattern recognition has employed artificial neural networks (ANNs) and genetic algorithms as systematic ways of matching pattern features. The selection of a damage-sensitive, noise-insensitive pattern feature is important for all structural damage identification methods. Accordingly, a neural network-based damage detection method using frequency response function (FRF) data is presented in this paper. This method can effectively account for uncertainties in the measured data from which training patterns are generated. The proposed method reduces the dimension of the initial FRF data, transforms it into new damage indices, and employs an ANN for damage localization and quantification using the recognized damage patterns. In civil engineering applications, the measurement of dynamic response under field conditions always contains noise components from environmental factors. To evaluate the performance of the proposed strategy with noise-polluted data, noise-contaminated measurements are also introduced to the algorithm. ANNs with optimal architecture give minimum training and testing errors and provide precise damage detection results; the optimal architecture is identified by varying the number of hidden layers and the number of neurons per hidden layer by trial and error. In real testing, the number and locations of measurement points used to obtain the structural response are critical for damage detection, so optimal sensor placement to improve damage identification is also investigated. A finite element model of a two-storey framed structure is used to train the neural network. The method performs accurately, giving low errors with simulated and noise-contaminated data for single and multiple damage cases. As a result, the proposed method can be used for structural health monitoring and damage detection, particularly where the measurement data set is very large. Furthermore, the results suggest that an optimal ANN architecture can detect damage occurrence with good accuracy and can quantify damage with reasonable accuracy under varying levels of damage.
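A schematic of the pipeline is sketched below with toy data and illustrative layer sizes (an assumed reconstruction, not the paper's code): compress the FRF spectra into a few damage indices, then train a small network to map indices to damage location and severity.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
frf = rng.normal(size=(200, 1024))       # 200 simulated FRF spectra (toy stand-in)
damage = rng.uniform(size=(200, 2))      # [location, severity] labels (toy stand-in)

pca = PCA(n_components=10).fit(frf)      # dimension reduction -> "damage indices"
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
ann.fit(pca.transform(frf), damage)      # learn indices -> damage mapping

# Noise-polluted measurements pass through the same PCA + ANN chain:
noisy = frf[:5] + 0.05 * rng.normal(size=(5, 1024))
print(ann.predict(pca.transform(noisy)))
```

In practice the training pairs would come from the finite element model of the structure, and the hidden-layer sizes would be tuned by the trial-and-error search described above.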

Relevance:

20.00%

Publisher:

Abstract:

We developed a theoretical framework for organizing obesity prevention interventions by their likely impact on the socioeconomic gradient of weight. The degree to which an intervention involves individual agency versus structural change influences socioeconomic inequalities in weight. Agentic interventions, such as standalone social marketing, increase socioeconomic inequalities. Structural interventions, such as food procurement policies and restrictions on unhealthy foods in schools, show equal or greater benefit for lower socioeconomic groups. Many obesity prevention interventions are agento-structural: they account for the environment in which health behaviors occur but still require a level of individual agency for behavioral change; examples include workplace design to encourage exercise and fiscal regulation of unhealthy foods or beverages. Obesity prevention interventions differ in their effectiveness across socioeconomic groups. Limiting further increases in socioeconomic inequalities in obesity requires the implementation of structural interventions. Further empirical evaluation, especially of agento-structural interventions, remains crucial.

Relevance:

20.00%

Publisher:

Abstract:

Dissecting how genetic and environmental influences impact on learning is helpful for maximizing numeracy and literacy. Here we show, using twin and genome-wide analysis, that there is a substantial genetic component to children’s ability in reading and mathematics, and estimate that around one half of the observed correlation in these traits is due to shared genetic effects (so-called Generalist Genes). Thus, our results highlight the potential role of the learning environment in contributing to differences in a child’s cognitive abilities at age twelve.

Relevance:

20.00%

Publisher:

Abstract:

Introduction & aims: The demand for evidence of the efficacy of treatments in general, and of orthopaedic surgical procedures in particular, is ever increasing in Australia and worldwide. The aim of this study is to share the key elements of an evaluation framework recently implemented in Australia to determine the efficacy of bone-anchored prostheses.

Method: The proposed evaluation framework for determining the benefits and harms of bone-anchored prostheses for individuals with limb loss was extracted from a systematic review of the literature, including seminal studies focusing on the clinical benefits and safety of procedures involving screw-type implants (e.g., OPRA) and press-fit fixations (e.g., EEFT, ILP, OPL). [1-64]

Results: The literature review highlighted that a standard and replicable evaluation framework should focus on:
• The clinical benefits, with systematic recording of health-related quality of life (e.g., SF-36, Q-TFA), mobility predictor (e.g., AMPRO), ambulation abilities (e.g., TUG, 6MWT), walking abilities (e.g., characteristic spatio-temporal variables) and actual activity level, at baseline and at follow-up after Stage 2 surgery;
• The potential harms, with systematic recording of residuum care, infection, implant stability, implant integrity and injuries (e.g., falls) after Stage 1 surgery.
There was general consensus around the instruments used to monitor most of the benefits and harms. The benefits could be assessed using a wide spectrum of complementary assessments, ranging from subjective patient self-reporting to objective measurement of physical activity; the latter, however, was assessed using a broad range of instruments (e.g., pedometer, load cell, energy consumption). More importantly, the lack of consistent grading of infections was sufficiently noticeable to impede cross-fixation comparisons. Clearly, a more universal grading system is needed.

Conclusions: Investigators are encouraged to implement an evaluation framework featuring the domains and instruments proposed above, using a single database, to facilitate robust prospective studies of the potential benefits and harms of their procedures. This work is also a milestone in the development of national and international clinical outcome registries.

Relevance:

20.00%

Publisher:

Abstract:

Gene expression is arguably the most important indicator of biological function. Identifying differentially expressed genes is thus one of the main aims of high-throughput studies that use microarray and RNAseq platforms to study deregulated cellular pathways. There are many tools for analysing differential gene expression from transcriptomic datasets. The major challenge is estimating gene expression variance, given the high level of 'background noise' generated by biological equipment and the lack of biological replicates. Bayesian inference has been widely used in the bioinformatics field. In this work, we reveal that the prior knowledge employed in the Bayesian framework also helps to improve the accuracy of differential gene expression analysis when using a small number of replicates. We have developed a differential analysis tool that uses Bayesian estimation of the variance of gene expression for use with small numbers of biological replicates. Our method is more consistent than the widely used cyber-t tool, which first successfully introduced the Bayesian framework to differential analysis. We also provide a user-friendly, web-based graphical user interface for biologists to use with microarray and RNAseq data. Bayesian inference can compensate for the instability of variance estimates caused by small numbers of biological replicates by using pseudo replicates as prior knowledge. We also show that our new strategy for selecting pseudo replicates improves the performance of the analysis.
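The variance-stabilization idea can be sketched as follows (in the spirit of cyber-t; the window, weights and names are illustrative assumptions, not the authors' exact model): shrink each gene's pooled variance toward a prior estimated from genes of similar average expression, which act as pseudo replicates.

```python
import numpy as np

def moderated_t(a: np.ndarray, b: np.ndarray, window: int = 101, n0: int = 10):
    """a, b: (genes, replicates) expression matrices; returns a moderated t per gene.

    Assumes many more genes than `window`.
    """
    na, nb = a.shape[1], b.shape[1]
    va, vb = a.var(1, ddof=1), b.var(1, ddof=1)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    # Prior variance: running mean of pooled variances over genes ranked by
    # average expression, so each gene borrows strength from its neighborhood.
    order = np.argsort(a.mean(1) + b.mean(1))
    prior = np.empty_like(pooled)
    prior[order] = np.convolve(pooled[order], np.ones(window) / window, mode="same")
    post = (n0 * prior + (na + nb - 2) * pooled) / (n0 + na + nb - 2)  # shrunken variance
    return (a.mean(1) - b.mean(1)) / np.sqrt(post * (1 / na + 1 / nb))
```

With only two or three replicates per group, the raw per-gene variance is extremely unstable; the prior term dominates and prevents spuriously small denominators.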

Relevance:

20.00%

Publisher:

Abstract:

Nepal, as a consequence of its geographical location and changing climate, faces frequent threats of natural disasters. According to the World Bank's 2005 Natural Disasters Hotspots report, Nepal is ranked the 11th most vulnerable country to earthquakes and 30th to flood risk, and Geo-Hazards International (2011) has classified Kathmandu as one of the world's cities most vulnerable to earthquakes. In the last four decades, more than 32,000 people in Nepal have lost their lives, and the annual monetary loss is estimated at more than US$15 million. This review identifies gaps in knowledge and progress towards implementation of the post-Hyogo Framework for Action. Nepal has identified priority areas: community resilience, sustainable development and climate-change-induced disaster risk reduction. However, one gap between policy and action lies in Nepal's ability to act effectively in accordance with an appropriate framework for media activities. Supporting media agencies include the Press Council, the Federation of Nepalese Journalists, Nepal Television, Radio Nepal, the Telecommunications Authority and community-based organizations. The challenge lies in further strengthening traditional and new media to undertake systematic work supported by government bodies and the Nepal Risk Reduction Consortium (NRRC). Within this context, the ideal role for the media is a proactive one, in which journalists pay attention to a range of appropriate angles or frames when preparing and disseminating information. It is important to develop policy for effective information collection, sharing and dissemination in collaboration with telecommunications providers, media organizations and journalists. The aim of this paper is to describe developments in disaster management in Nepal and their implications for media management. This study provides lessons for government, community and the media to help improve the framing of disaster messages. Significantly, the research highlights the prominence that should be given to floods, landslides, lightning and earthquakes.

Relevance:

20.00%

Publisher:

Abstract:

Shared eHealth record systems offer promising benefits for improving healthcare through high availability of information and improved decision making; however, their uptake has been hindered by concerns over the privacy of patient information. To address these privacy concerns while meeting healthcare professionals' requirement for access to the information they need to provide appropriate care, the use of an Information Accountability Framework (IAF) has been proposed. For the IAF and so-called Accountable-eHealth systems to become a reality, the framework must provide for a diverse range of users and use cases. The initial IAF model did not provide for more diverse use cases, such as the need for certain users to delegate access to another user who acts on their behalf while accountability is maintained. In this paper, we define the requirements for delegation of access in the IAF and how such access policies would be represented in the framework, and we implement and validate an expanded IAF model.
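A minimal sketch of how a delegated-access policy might be represented (an illustrative assumption, not the IAF specification): a delegation names the delegator, the delegate, the scope and an expiry, and every access by the delegate is logged against both parties so that accountability is preserved.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class Delegation:
    delegator: str      # user granting access (e.g. the patient)
    delegate: str       # user acting on their behalf (e.g. a carer)
    scope: str          # record section the delegate may access
    expires: datetime

audit_log: list[tuple[str, str, str, datetime]] = []

def access(record_section: str, user: str, delegations: list[Delegation]) -> bool:
    """Grant access only under a live delegation, logging it for accountability."""
    now = datetime.now()
    for d in delegations:
        if d.delegate == user and d.scope == record_section and now < d.expires:
            # Log delegate AND delegator: the act remains attributable to both.
            audit_log.append((user, d.delegator, record_section, now))
            return True
    return False

grants = [Delegation("patient42", "carer7", "medications",
                     datetime.now() + timedelta(days=30))]
print(access("medications", "carer7", grants))   # True, and the access is audited
```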

Relevance:

20.00%

Publisher:

Abstract:

We propose a robust method for mosaicing document images using features derived from connected components. Each connected component is described using the Angular Radial Transform (ART). To ensure geometric consistency during feature matching, the ART coefficients of a connected component are augmented with those of its two nearest neighbors. The proposed method addresses two critical issues often encountered in correspondence matching: (i) the stability of features, and (ii) robustness against false matches due to multiple instances of the same characters in a document image. The use of connected components guarantees stable localization across images, while the augmented features ensure successful correspondence matching even in the presence of multiple similar regions within the page. We illustrate the effectiveness of the proposed method on camera-captured document images exhibiting large variations in viewpoint, illumination and scale.
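The augmentation step can be sketched as follows (the ART computation itself is omitted; the array names are assumptions): each component's descriptor is concatenated with those of its two nearest neighbors, so a match must agree on local geometry as well as shape, which suppresses false matches between repeated characters.

```python
import numpy as np

def augmented_descriptors(centroids: np.ndarray, descriptors: np.ndarray) -> np.ndarray:
    """centroids: (n, 2) component centers; descriptors: (n, d) ART coefficients.

    Returns (n, 3d) vectors: own descriptor plus those of the two nearest neighbors.
    """
    dist = np.linalg.norm(centroids[:, None] - centroids[None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)           # a component is not its own neighbor
    nn = np.argsort(dist, axis=1)[:, :2]     # two nearest neighbors per component
    return np.hstack([descriptors, descriptors[nn[:, 0]], descriptors[nn[:, 1]]])

# Correspondence matching then proceeds on the augmented vectors, e.g. by
# nearest neighbor in feature space between the two images being mosaiced.
```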

Relevance:

20.00%

Publisher:

Abstract:

Two inorganic-organic hybrid framework iron phosphate-oxalates, I, [N2C4H12]0.5[Fe2(HPO4)(C2O4)1.5], and II, [Fe2(OH2)PO4(C2O4)0.5], have been synthesized by hydrothermal means and their structures determined by X-ray crystallography. Crystal data: compound I, monoclinic, space group P2(1)/c (No. 14), a = 7.569(2) Å, b = 7.821(2) Å, c = 18.033(4) Å, β = 98.8(1)°, V = 1055.0(4) Å^3, Z = 4, M = 382.8, Dcalc = 2.41 g cm^-3, Mo Kα, RF = 0.02; compound II, monoclinic, space group P2(1)/c (No. 14), a = 10.240(1) Å, b = 6.375(3) Å, c = 9.955(1) Å, β = 117.3(1)°, V = 577.4(1) Å^3, Z = 4, M = 268.7, Dcalc = 3.09 g cm^-3, Mo Kα, RF = 0.03. These materials contain a high proportion of three-coordinated oxygens and [Fe2O9] dimeric units, besides other interesting structural features. The connectivity of the Fe2O9 units is entirely different in the two materials, resulting in the formation of a continuous Fe-O-Fe chain in II. The phosphate-oxalate containing the amine, I, forms well-defined channels. Magnetic susceptibility measurements show Fe(II) to be in the high-spin state (t2g^4 eg^2) in II, and in the intermediate-spin state (t2g^5 eg^1) in I.
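As a quick consistency check, the reported densities follow from the cell contents via the standard relation, shown here for compound I (N_A is Avogadro's number):

```latex
\[
D_{\mathrm{calc}} = \frac{Z\,M}{N_A\,V}
  = \frac{4 \times 382.8\ \mathrm{g\ mol^{-1}}}
         {(6.022 \times 10^{23}\ \mathrm{mol^{-1}})\,(1055.0 \times 10^{-24}\ \mathrm{cm^{3}})}
  \approx 2.41\ \mathrm{g\ cm^{-3}}
\]
```

The same relation reproduces the 3.09 g cm^-3 reported for compound II.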

Relevance:

20.00%

Publisher:

Abstract:

The random early detection (RED) technique has seen a lot of research over the years. However, the functional relationship between RED performance and its parameters, viz., queue weight (ω_q), marking probability (max_p), minimum threshold (min_th) and maximum threshold (max_th), is not analytically available. In this paper, we formulate a probabilistic constrained optimization problem by assuming a nonlinear relationship between the RED average queue length and its parameters. This problem involves all the RED parameters as the variables of the optimization problem. We use the barrier and the penalty function approaches for its solution. However, as above, the exact functional relationship between the barrier and penalty objective functions and the optimization variables is not known, although noisy samples of these are available for different parameter values. Thus, for obtaining the gradient and Hessian of the objective, we use certain recently developed simultaneous perturbation stochastic approximation (SPSA) based estimates. We propose two four-timescale stochastic approximation algorithms based on certain modified second-order SPSA updates for finding the optimal RED parameters. We present the results of detailed simulation experiments conducted over different network topologies and network/traffic conditions/settings, comparing the performance of our algorithms with variants of RED and a few other well-known active queue management (AQM) techniques discussed in the literature.
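The core SPSA estimate that makes this tractable can be sketched in a few lines (a toy illustration with an assumed quadratic loss; the paper's four-timescale, second-order variants are considerably more involved): all four RED parameters are perturbed simultaneously, and the full gradient is estimated from just two noisy measurements per iteration.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_loss(theta: np.ndarray) -> float:
    """Stand-in for a noisy simulation measurement at RED parameters theta."""
    return float(np.sum((theta - 0.3) ** 2) + 0.01 * rng.normal())

def spsa_step(theta: np.ndarray, a: float = 0.1, c: float = 0.05) -> np.ndarray:
    delta = rng.choice([-1.0, 1.0], size=theta.shape)   # Rademacher perturbation
    # Two measurements suffice for the whole gradient vector:
    g = (noisy_loss(theta + c * delta) - noisy_loss(theta - c * delta)) / (2 * c * delta)
    return theta - a * g                                # stochastic gradient step

theta = np.array([0.002, 0.1, 0.2, 0.8])   # e.g. [omega_q, max_p, min_th, max_th], scaled
for _ in range(200):
    theta = spsa_step(theta)
print(theta)   # settles near the toy optimum of 0.3 in every coordinate
```

In practice the gains a and c decay over iterations, and the multi-timescale structure lets the second-order (Hessian) estimates evolve more slowly than the parameter updates.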