943 results for Localization accuracy metrics


Relevance: 20.00%

Abstract:

The strain-induced self-assembly of suitable semiconductor pairs is an attractive natural route to nanofabrication. To bring their full potential for actual applications to fruition, individual nanostructures need to be combined into ordered patterns in which the location of each single unit is coupled to the others and to the surrounding environment. Within the Ge/Si model system, we analyze a number of examples of bottom-up strategies in which the shape, positioning and actual growth mode of epitaxial nanostructures are tailored by manipulating the intrinsic physical processes of heteroepitaxy. The possibility of controlling elastic interactions, and hence the configuration of self-assembled quantum dots, by modulating surface orientation with the miscut angle is discussed. We focus on the use of atomic steps and step bunching as natural templates for nanodot clustering. We then consider several different patterning techniques which allow one to harness the natural self-organization dynamics of the system, such as scanning tunneling nanolithography, focused ion beam patterning and nanoindentation patterning. By following the evolution of the dot assembly with scanning probe microscopy, we trace the pathway that leads to lateral ordering, discussing the thermodynamic and kinetic effects involved in selective nucleation on patterned substrates.

Relevance: 20.00%

Abstract:

Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when wearing shoes. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sufficiently sensitive to describe the kinematics of the foot–shoe complex and lower leg during walking gait. To achieve this, a new marker set was established, consisting of 25 markers applied to the shoe and skin surface, which informed a four-segment kinematic model of the foot–shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe was shown to be good to excellent (ICC = 0.75–0.98), indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC = 0.68–0.99) than the inexperienced rater (ICC = 0.38–0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint: tibiocalcaneal joint, MDD90 = 2.17–9.36°; tarsometatarsal joint, MDD90 = 1.03–9.29°; and metatarsophalangeal joint, MDD90 = 1.75–9.12°. These proposed thresholds are specific to the description of shod motion and can be used in future research designed to compare different footwear.
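
For illustration, MDD thresholds of this kind are conventionally derived from the intraclass correlation coefficient and the between-trial standard deviation via the standard error of measurement (SEM = SD·√(1−ICC), MDD90 = 1.645·SEM·√2). A minimal sketch of that standard relationship, with placeholder numbers rather than values from the study:

```python
import math

def mdd90(sd: float, icc: float) -> float:
    """Minimal detectable difference at 90% confidence, using the
    standard relationship SEM = SD*sqrt(1-ICC), MDD90 = 1.645*SEM*sqrt(2)."""
    sem = sd * math.sqrt(1.0 - icc)
    return 1.645 * sem * math.sqrt(2.0)

# Placeholder inputs (not from the study): joint-angle SD of 3.5 deg
# across repeated trials and an ICC of 0.85.
print(round(mdd90(sd=3.5, icc=0.85), 2))
```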

Relevance: 20.00%

Abstract:

The objective quantification of three-dimensional kinematics during different functional and occupational tasks is in greater demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system with those from a Natural Point OptiTrack system. In all experiments, data were sampled simultaneously. Both systems performed excellently in the linear accuracy tests, with absolute errors not exceeding 1%. In the gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer a platform with accuracy acceptable for research in laboratories with a limited budget.
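
Static and dynamic linear accuracy tests of this kind typically reduce to the absolute error of a reconstructed inter-marker distance, expressed as a percentage of the known reference length. A minimal sketch, with hypothetical distances rather than the paper's protocol:

```python
import numpy as np

def absolute_percent_error(measured_mm: np.ndarray, true_mm: float) -> np.ndarray:
    """Absolute error of reconstructed inter-marker distances as a
    percentage of the known reference distance."""
    return np.abs(measured_mm - true_mm) / true_mm * 100.0

# Hypothetical: a rigid bar carrying two markers a known 500 mm apart,
# reconstructed in four trials.
measured = np.array([499.2, 500.4, 500.9, 498.8])
print(absolute_percent_error(measured, true_mm=500.0))  # all well under 1%
```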

Relevance: 20.00%

Abstract:

The accuracy of marker placement on palpable surface anatomical landmarks is an important consideration in biomechanics. Although marker placement reliability has been studied in some depth, it remains unclear whether or not the markers are accurately positioned over the intended landmark in order to define the static position and orientation of the segment. A novel method using commonly available X-ray imaging was developed to identify the accuracy of markers placed on the shoe surface by palpating landmarks through the shoe. An anterior–posterior and a lateral–medial X-ray were taken of 24 participants with a newly developed marker set applied to both the skin and shoe. The vector magnitude of both skin- and shoe-mounted markers from the anatomical landmark was calculated, as well as the mean marker offset between skin- and shoe-mounted markers. The accuracy of placing markers on the shoe relative to the skin-mounted markers, accounting for shoe thickness, was less than 5 mm for all markers studied. Further, when using the developed guidelines provided in this study, the method was deemed reliable (intra-rater ICCs = 0.50–0.92). In conclusion, the method proposed here can reliably assess marker placement accuracy on the shoe surface relative to chosen anatomical landmarks beneath the skin.
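
The vector magnitude reported here is simply the Euclidean distance between the digitized position of a marker and that of the underlying anatomical landmark. A minimal sketch with hypothetical coordinates (the actual study digitized positions from two radiographic views):

```python
import numpy as np

def marker_offset_mm(marker_xyz: np.ndarray, landmark_xyz: np.ndarray) -> float:
    """Euclidean distance (mm) between a marker centre and the
    anatomical landmark it is intended to sit over."""
    return float(np.linalg.norm(marker_xyz - landmark_xyz))

# Hypothetical digitized positions in millimetres.
shoe_marker = np.array([102.4, 55.1, 210.0])
landmark = np.array([100.0, 53.0, 208.5])
print(round(marker_offset_mm(shoe_marker, landmark), 2))  # expected < 5 mm
```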

Relevance: 20.00%

Abstract:

Insect monitoring and sampling programmes are used in the stored grains industry for the detection and estimation of insect pests. At the low pest densities dictated by economic and commercial requirements, the accuracy of both detection and abundance estimates can be influenced by variations in the spatial structure of pest populations over short distances. Geostatistical analysis of Rhyzopertha dominica populations in two dimensions showed that, in both the horizontal and vertical directions and at all temperatures examined, insect numbers were positively correlated over short (0–5 cm) distances and negatively correlated over longer (≥10 cm) distances. Analysis in three dimensions showed a similar pattern, with positive correlations over short distances and negative correlations at longer distances. At 35°C, insects were located significantly further from the grain surface than at 25 and 30°C. Dispersion metrics showed statistically significant aggregation in all cases. This is the first research to use small sample units, high sampling intensities and a range of temperatures to show spatial structuring of R. dominica populations over short distances. This research will have significant implications for sampling in the stored grains industry.
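
The reported pattern, positive correlation at 0–5 cm lags and negative correlation at ≥10 cm lags, is what a correlogram over distance classes expresses. A minimal, illustrative sketch using a Moran's-I-style statistic per distance band (not the authors' geostatistical software; the grid and counts are simulated):

```python
import numpy as np

def morans_i(counts: np.ndarray, coords: np.ndarray,
             d_lo: float, d_hi: float) -> float:
    """Moran's I for one distance band: pairs of sample units separated
    by a distance in [d_lo, d_hi] get weight 1, all others weight 0."""
    z = counts - counts.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = ((d >= d_lo) & (d <= d_hi)).astype(float)
    np.fill_diagonal(w, 0.0)  # exclude self-pairs
    n = len(counts)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Illustrative 3-D grid of sample units spaced 5 cm apart, random counts.
rng = np.random.default_rng(0)
coords = 5.0 * np.array([[x, y, z] for x in range(5)
                         for y in range(5) for z in range(5)], dtype=float)
counts = rng.poisson(2.0, size=len(coords)).astype(float)
print(morans_i(counts, coords, 0.0, 5.0))    # short-lag band
print(morans_i(counts, coords, 10.0, 50.0))  # long-lag band
```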

Relevance: 20.00%

Abstract:

Introduction: The suitability of video conferencing (VC) technology for clinical purposes relevant to geriatric medicine is still being established. This project aimed to determine the validity of diagnosing dementia via VC. Methods: This was a multisite, noninferiority, prospective cohort study. Patients aged 50 years and older, referred by their primary care physician for cognitive assessment, were assessed at 4 memory disorder clinics. All patients were assessed independently by 2 specialist physicians. They were allocated one face-to-face (FTF) assessment (the reference standard, usual clinical practice) and an additional assessment (either a usual FTF assessment or a VC assessment) on the same day. Each specialist physician had access to the patient chart and the results of a battery of standardized cognitive assessments administered FTF by the clinic nurse. Percentage agreement (P0) and the weighted kappa statistic with linear weights (Kw) were used to assess inter-rater reliability across the 2 study groups on the diagnosis of dementia (cognition normal, impaired, or demented). Results: The 205 patients were allocated to the Videoconference group (n = 100) or the Standard practice group (n = 105); 106 were men. The average age was 76 (SD 9, range 51–95) and the average Standardized Mini-Mental State Examination score was 23.9 (SD 4.7, range 9–30). Agreement for the Videoconference group (P0 = 0.71; Kw = 0.52; P < .0001) and agreement for the Standard practice group (P0 = 0.70; Kw = 0.50; P < .0001) were both statistically significant (P < .05). The summary kappa statistic of 0.51 (P = .84) indicated that VC was not inferior to FTF assessment. Conclusions: Previous studies have shown that preliminary standardized assessment tools can be reliably administered and scored via VC. This study focused on the geriatric assessment component of the interview (interpretation of standardized assessments, taking a history and formulating a diagnosis by a medical specialist) and identified high levels of agreement for diagnosing dementia. A model of service incorporating either locally or remotely administered standardized assessments, together with remote specialist assessment, is a reliable process for enabling the diagnosis of dementia for isolated older adults.
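
The linearly weighted kappa used here can be computed directly from the 3×3 agreement table over the ordered categories (normal, impaired, demented). A minimal sketch; the table below is invented for illustration and is not the study's data:

```python
import numpy as np

def weighted_kappa(table: np.ndarray) -> float:
    """Cohen's kappa with linear weights for ordinal categories.
    table[i, j] counts patients rated category i by one physician
    and category j by the other."""
    k = table.shape[0]
    obs = table / table.sum()
    # Linear disagreement weights: 0 on the diagonal, 1 at the corners.
    w = np.abs(np.subtract.outer(np.arange(k), np.arange(k))) / (k - 1)
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    return 1.0 - (w * obs).sum() / (w * expected).sum()

# Hypothetical table over (normal, impaired, demented).
table = np.array([[20, 5, 1],
                  [4, 25, 6],
                  [1, 5, 33]], dtype=float)
print(round(weighted_kappa(table), 2))
```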

Relevance: 20.00%

Abstract:

Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program's object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code. The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program's security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool's capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
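
As a flavour of what a design-level security metric can look like (an illustrative toy, not one of the paper's actual metrics), one can measure how far security-classified attributes leak through a class's public interface, with lower values indicating better encapsulation of secrets:

```python
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    public: bool      # accessible from outside the class
    classified: bool  # holds high-security data

def classified_exposure(attrs: list[Attribute]) -> float:
    """Fraction of classified attributes that are publicly accessible:
    0.0 means all secrets are encapsulated, 1.0 means all are exposed."""
    secrets = [a for a in attrs if a.classified]
    if not secrets:
        return 0.0
    return sum(a.public for a in secrets) / len(secrets)

# Hypothetical class design: one of two classified attributes is public.
design = [Attribute("pin", public=True, classified=True),
          Attribute("key", public=False, classified=True),
          Attribute("log", public=True, classified=False)]
print(classified_exposure(design))  # 0.5
```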

Relevance: 20.00%

Abstract:

Effective enterprise information security policy management requires review and assessment activities to ensure that information security policies are aligned with business goals and objectives. Because security policy management involves both the policy development process and the security policy it produces, security policy assessment requires goal-based metrics for these two elements. However, current security management assessment methods only provide checklist-style assessments predefined by industry best practices and do not allow specific goal-based metrics to be developed. Drawing on theories from the literature, this paper proposes the Enterprise Information Security Policy Assessment approach, which expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied to a case scenario to illustrate a practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows process-based and product-based assessment to be undertaken concurrently. Recommendations for further research include empirical studies to validate the propositions and practical application of the proposed assessment approach in case studies, providing opportunities to introduce further enhancements to the approach.
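
The Goal-Question-Metric approach decomposes each assessment goal into questions, and each question into measurable metrics. A minimal sketch of that hierarchy applied to a security-policy goal; the goal, question and metric texts are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    value: float | None = None  # filled in during assessment

@dataclass
class Question:
    text: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list[Question] = field(default_factory=list)

# Hypothetical GQM tree for assessing the policy development process.
goal = Goal(
    purpose="Keep the security policy aligned with business objectives",
    questions=[Question(
        text="How current is the policy relative to the business environment?",
        metrics=[Metric("months_since_last_review"),
                 Metric("fraction_of_business_objectives_covered")])])
for q in goal.questions:
    print(q.text, [m.name for m in q.metrics])
```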

Relevance: 20.00%

Abstract:

Proteasomes can exist in several different molecular forms in mammalian cells. The core 20S proteasome, containing the proteolytic sites, binds regulatory complexes at the ends of its cylindrical structure. Together with two 19S ATPase regulatory complexes it forms the 26S proteasome, which is involved in ubiquitin-dependent proteolysis. The 20S proteasome can also bind 11S regulatory complexes (REG, PA28) which play a role in antigen processing, as do the three variable γ-interferon-inducible catalytic β-subunits (e.g. LMP7). In the present study, we have investigated the subcellular distribution of the different forms of proteasomes using subunit-specific antibodies. Both 20S proteasomes and their 19S regulatory complexes are found in nuclear, cytosolic and microsomal preparations isolated from rat liver. LMP7 was enriched approximately two-fold compared with core α-type proteasome subunits in the microsomal preparations. 20S proteasomes were more abundant than 26S proteasomes, both in liver and cultured cell lines. Interestingly, some significant differences were observed in the distribution of different subunits of the 19S regulatory complexes. S12, and to a lesser extent p45, were found to be relatively enriched in nuclear fractions from rat liver, and immunofluorescent labelling of cultured cells with anti-p45 antibodies showed stronger labelling in the nucleus than in the cytoplasm. The REG was found to be localized predominantly in the cytoplasm. Three- to six-fold increases in the level of REG were observed following γ-interferon treatment of cultured cells, but γ-interferon had no obvious effect on its subcellular distribution. These results demonstrate that different regulatory complexes and subpopulations of proteasomes have different distributions within mammalian cells and, therefore, that the distribution is more complex than has been reported for yeast proteasomes.

Relevance: 20.00%

Abstract:

In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
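
Under the autocorrelated lognormal model examined here, a branch's rate is drawn from a lognormal distribution centred on its parent's rate, with log-rate variance growing in proportion to elapsed time. A minimal simulation sketch of that rate-change process; the parameter values are arbitrary:

```python
import math
import random

def child_rate(parent_rate: float, t: float, nu: float,
               rng: random.Random) -> float:
    """Draw a child branch rate under an autocorrelated lognormal model:
    log(r_child) ~ Normal(log(r_parent) - nu*t/2, nu*t), where the
    -nu*t/2 term keeps E[r_child] equal to r_parent."""
    var = nu * t
    mu = math.log(parent_rate) - var / 2.0
    return math.exp(rng.gauss(mu, math.sqrt(var)))

rng = random.Random(1)
root_rate = 1e-3  # substitutions/site/Myr, arbitrary
print([round(child_rate(root_rate, t=10.0, nu=0.05, rng=rng), 6)
       for _ in range(5)])
```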

Relevance: 20.00%

Abstract:

Utilizing a mono-specific antiserum produced in rabbits to hog kidney aromatic L-amino acid decarboxylase (AADC), the enzyme was localized in rat kidney by immunoperoxidase staining. AADC was located predominantly in the proximal convoluted tubules; there was also weak staining in the distal convoluted tubules and collecting ducts. An increase in dietary potassium or sodium intake produced no change in density or distribution of AADC staining in kidney. An assay of AADC enzyme activity showed no difference in cortex or medulla with chronic potassium loading. A change in distribution or activity of renal AADC does not explain the postulated dopaminergic modulation of renal function that occurs with potassium or sodium loading.

Relevance: 20.00%

Abstract:

Appearance-based localization can provide loop closure detection at vast scales regardless of accumulated metric error. However, the computation time and memory requirements of current appearance-based methods scale not only with the size of the environment but also with the operation time of the platform. Additionally, repeated visits to locations will develop multiple competing representations, which will reduce recall performance over time. These properties impose severe restrictions on long-term autonomy for mobile robots, as loop closure performance will inevitably degrade with increased operation time. In this paper we present a graphical extension to CAT-SLAM, a particle filter-based algorithm for appearance-based localization and mapping, to provide constant computation and memory requirements over time and minimal degradation of recall performance during repeated visits to locations. We demonstrate loop closure detection in a large urban environment with capped computation time and memory requirements and performance exceeding previous appearance-based methods by a factor of 2. We discuss the limitations of the algorithm with respect to environment size, appearance change over time and applications in topological planning and navigation for long-term robot operation.
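
Appearance-based loop-closure candidates are commonly scored by the similarity between bag-of-visual-words descriptors of the current image and of previously mapped locations. The sketch below shows simple cosine-similarity scoring; it illustrates the general idea only and is not the CAT-SLAM particle filter itself:

```python
import numpy as np

def best_loop_closure(query: np.ndarray, map_descriptors: np.ndarray,
                      threshold: float = 0.9) -> int | None:
    """Return the index of the most similar mapped location if its
    cosine similarity to the query exceeds `threshold`, else None."""
    q = query / np.linalg.norm(query)
    m = map_descriptors / np.linalg.norm(map_descriptors, axis=1,
                                         keepdims=True)
    sims = m @ q
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None

# Hypothetical bag-of-words histograms (one row per mapped location).
rng = np.random.default_rng(42)
mapped = rng.random((100, 512))
query = mapped[17] + 0.05 * rng.random(512)  # a revisit of location 17
print(best_loop_closure(query, mapped))      # expect 17
```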

Relevance: 20.00%

Abstract:

Appearance-based localization is increasingly used for loop closure detection in metric SLAM systems. Since it relies only upon the appearance-based similarity between images from two locations, it can perform loop closure regardless of accumulated metric error. However, the computation time and memory requirements of current appearance-based methods scale linearly not only with the size of the environment but also with the operation time of the platform. These properties impose severe restrictions on long-term autonomy for mobile robots, as loop closure performance will inevitably degrade with increased operation time. We present a set of improvements to the appearance-based SLAM algorithm CAT-SLAM to constrain computation scaling and memory usage with minimal degradation in performance over time. The appearance-based comparison stage is accelerated by exploiting properties of the particle observation update, and nodes in the continuous trajectory map are removed according to minimal information loss criteria. We demonstrate constant time and space loop closure detection in a large urban environment with recall performance exceeding FAB-MAP by a factor of 3 at 100% precision, and investigate the minimum computational and memory requirements for maintaining mapping performance.
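
Node removal under a minimal-information-loss criterion can be illustrated by greedily discarding the trajectory node whose appearance descriptor is most redundant given the surviving nodes. This is a simplified stand-in for the paper's actual criterion, not its implementation:

```python
import numpy as np

def prune_most_redundant(descriptors: np.ndarray, keep: int) -> list[int]:
    """Greedily drop the node most similar to its nearest surviving
    neighbour until only `keep` node indices remain."""
    alive = list(range(len(descriptors)))
    d = descriptors / np.linalg.norm(descriptors, axis=1, keepdims=True)
    while len(alive) > keep:
        sub = d[alive]
        sims = sub @ sub.T
        np.fill_diagonal(sims, -np.inf)
        redundancy = sims.max(axis=1)  # similarity to nearest neighbour
        alive.pop(int(np.argmax(redundancy)))
    return alive

# Hypothetical node descriptors for a 50-node trajectory map.
rng = np.random.default_rng(0)
nodes = rng.random((50, 128))
print(prune_most_redundant(nodes, keep=10))
```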

Relevance: 20.00%

Abstract:

This unique book reveals the procedural aspects of knowledge-based urban planning, development and assessment. Concentrating on major knowledge city building processes, and providing state-of-the-art experiences and perspectives, this important compendium explores innovative models, approaches and lessons learnt from a number of key case studies across the world. Many cities worldwide have undergone major transformations in the 21st century in order to brand themselves as knowledge cities. This book provides a thorough understanding of these transformations and of the key issues in building prosperous knowledge cities in the knowledge economy era, focusing particularly on policy-making, the planning process and performance assessment. The contributors reveal the theoretical and conceptual foundations of knowledge cities and their development approach of knowledge-based urban development, presenting best-practice examples from key case studies across the globe. The book will prove invaluable to national, state/regional and city governments' planning and development departments, while academics and postgraduate and undergraduate students of regional and urban studies will also find this path-breaking book an intriguing read.