936 results for reliable narrator


Relevance: 20.00%

Abstract:

Purpose: To determine whether the ‘through-focus’ aberrations of patients implanted with multifocal or accommodative intraocular lenses (IOLs) can be used to provide rapid and reliable measures of their subjective range of clear vision. Methods: Eyes that had been implanted with concentric (n = 8), segmented (n = 10) or accommodating (n = 6) intraocular lenses (mean age 62.9 ± 8.9 years; range 46-79 years) for over a year underwent simultaneous monocular subjective (electronic logMAR test chart at 4 m with letters randomised between presentations) and objective (Aston open-field aberrometer) defocus curve testing for levels of defocus from +1.50 to -5.00 DS in -0.50 DS steps, in a randomised order. Pupil size and ocular aberrations (a combination of the patient’s and the defocus-inducing lens aberrations) at each level of blur were measured by the aberrometer. Visual acuity was measured subjectively at each level of defocus to determine the traditional defocus curve. Objective acuity was predicted using image quality metrics. Results: The range of clear focus differed between the three IOL types (F = 15.506, p = 0.001) as well as between subjective and objective defocus curves (F = 6.685, p = 0.049). There was no statistically significant difference between subjective and objective defocus curves in the segmented or concentric ring MIOL groups (p > 0.05). However, a difference was found between the two measures in the accommodating IOL group (p < 0.001). Mean Delta logMAR (predicted minus measured logMAR) across all target vergences was -0.06 ± 0.19 logMAR. Predicted logMAR defocus curves for the multifocal IOLs did not show a near vision addition peak, unlike the subjective measurement of visual acuity. However, there was a strong positive correlation between measured and predicted logMAR for all three IOLs (Pearson’s correlation: p < 0.001).
Conclusions: Current subjective procedures are lengthy and do not enable important additional measures, such as defocus curves under different luminance or contrast levels, to be assessed, which may limit our understanding of MIOL performance in real-world conditions. In general, the objective aberrometry measures correlated well with the subjective assessment, indicating the relative robustness of this technique in evaluating post-operative success with segmented and concentric ring MIOLs.
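The two headline statistics reported above, mean Delta logMAR (predicted minus measured) and the Pearson correlation between measured and predicted acuity, can be computed as follows. This is an illustrative sketch only: the acuity values below are hypothetical placeholders, not the study's data.

```python
from statistics import mean

# Hypothetical measured vs. predicted logMAR across target vergences
# (placeholder values for illustration, not the study's data)
measured = [0.00, 0.05, 0.12, 0.20, 0.30, 0.42]
predicted = [0.02, 0.04, 0.10, 0.24, 0.35, 0.40]

# Delta logMAR as defined in the abstract: predicted minus measured
delta = [p - m for p, m in zip(predicted, measured)]
mean_delta = mean(delta)

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r(measured, predicted)
```

With these placeholder curves the correlation is strongly positive, mirroring the pattern the abstract reports across all three IOL types.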

Relevance: 20.00%

Abstract:

Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.

Relevance: 20.00%

Abstract:

Microneedles (MNs) are emerging devices that can be used for the delivery of drugs at specific locations [1]. Their performance is judged primarily by different features, and penetration through tissue is one of the most important aspects to evaluate. Detailed studies of MN performance require different kinds of in-vitro, ex-vivo and in-vivo tests. The main limitation of some of these tests is that biological tissue is heterogeneous, unstable and difficult to obtain. In addition, the use of biological materials sometimes presents legal issues. There are many studies dealing with artificial membranes for drug diffusion [2], but studies of artificial membranes for MN mechanical characterization are scarce [3]. To overcome these limitations, we have developed tests using synthetic polymeric membranes instead of biological tissue. The selected artificial membrane is homogeneous, stable and readily available. It is composed mainly of a roughly equal blend of a hydrocarbon wax and a polyolefin and is commercially available under the brand name Parafilm®. Different kinds of MN arrays prepared from crosslinked polymers were inserted into this membrane, and the results were correlated with insertion of the same arrays into ex-vivo neonatal porcine skin. The insertion depth of the MNs was evaluated using optical coherence tomography (OCT). Market uptake of MN transdermal patches can be improved by making the product user-friendly and easy to apply; therefore, manual insertion is preferred over other kinds of procedures. Consequently, the insertion studies were performed in neonatal porcine skin and the artificial membrane using a manual insertion force applied by human volunteers. These manual insertion studies correlated very well with the same studies performed using a Texture Analyzer.
These synthetic membranes seem to closely mimic the mechanical properties of skin for the insertion of MNs across different methods of insertion. In conclusion, this artificial membrane offers a valid alternative to biological tissue for testing MN insertion and is a good candidate for developing a reliable quality-control MN insertion test.

Relevance: 20.00%

Abstract:

In cardiovascular disease, the definition and detection of ECG parameters related to repolarization dynamics in post-MI patients is still a crucial unmet need. In addition, a 3D sensor in implantable medical devices would be a crucial means of assessing or predicting Heart Failure status, but the inclusion of such a feature is limited by hardware and firmware constraints. The aim of this thesis is the definition of a reliable surrogate of the 500 Hz ECG signal to reach the aforementioned objective. To evaluate the loss of delineation reliability caused by sampling-frequency reduction, the signals were consecutively down-sampled by factors of 2, 4 and 8, obtaining ECG signals sampled at 250, 125 and 62.5 Hz, respectively. The final goal is a feasibility assessment of fiducial-point detection, in order to translate those parameters into meaningful clinical parameters for Heart Failure prediction, such as the heterogeneity of T-wave intervals and the variability of the areas under T waves. An experimental setting for data collection on healthy volunteers was set up at the Bakken Research Center in Maastricht. A 16-channel ambulatory system, provided by TMSI, recorded the standard 12-lead ECG, two 3D accelerometers and a respiration sensor. The collection platform was configured with the TMSI proprietary software Polybench, and the data analysis was performed in Matlab. The main results of this study show that the 125 Hz sampling rate is a good candidate for reliable detection of fiducial points. T-wave intervals proved consistently stable even at 62.5 Hz. Further studies would be needed to provide a better comparison between sampling at 250 Hz and 125 Hz for the areas under the T waves.
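The down-sampling scheme described above can be sketched as plain decimation, keeping every k-th sample. This is a minimal illustration, not the thesis pipeline: a production implementation would low-pass filter before decimating (e.g. with scipy.signal.decimate) to avoid aliasing.

```python
# Minimal sketch: a 500 Hz ECG reduced by factors of 2, 4 and 8
# to 250, 125 and 62.5 Hz via plain decimation (no anti-alias filter).
FS_ORIGINAL = 500  # Hz

def downsample(signal, factor):
    """Keep every `factor`-th sample of the signal."""
    return signal[::factor]

# Placeholder waveform: 2 seconds of samples at 500 Hz (values arbitrary)
ecg_500 = [0.0] * (2 * FS_ORIGINAL)

rates = {}
for factor in (2, 4, 8):
    reduced = downsample(ecg_500, factor)
    rates[FS_ORIGINAL / factor] = len(reduced)
# rates: {250.0: 500, 125.0: 250, 62.5: 125}
```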

Relevance: 20.00%

Abstract:

The first of the four paths that structure Alberto Méndez's book "Los girasoles ciegos" – with its theory of the delayed end of the Spanish Civil War as a longing for destruction outside any strategy based on military logic – presents the construction of a memory based on certain oral marks: facts provided quietly by apparently non-central characters, distrust of written documents, and the use of speech patterns mostly associated with spontaneity, in order to establish a level of verisimilitude that makes memory emerge through parallel pathways considered relatively reliable (for example, in the case of a report), speech forged on the basis of indirect references, testimonies and letters. The aim of this paper is to consider an example of contemporary Spanish narrative in which a journey – perhaps weak as material support for the channels through which the narrator enters the story through the voice of the people, but functional as an approach to a search of the recent past – contributes to a certain conception of memory.

Relevance: 20.00%

Abstract:

This work introduces a tessellation-based model for the declivity analysis of geographic regions. The analysis of relief declivity, which is embedded in the rules of the model, categorizes each tessellation cell, with respect to the whole considered region, according to the sign (positive, negative, or null) of the declivity of the cell. This information is represented in the states assumed by the cells of the model. The overall configuration of the cells allows the division of the region into subregions of cells belonging to the same category, that is, presenting the same declivity sign. In order to control the errors coming from the discretization of the region into tessellation cells, or resulting from numerical computations, interval techniques are used. The implementation of the model is naturally parallel, since the analysis is performed on the basis of local rules. An immediate application is in geophysics, where an adequate subdivision of geographic areas into segments presenting similar topographic characteristics is often convenient.
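The cell categorization described above can be sketched as follows. This is an illustrative toy on a 1-D strip of cells, not the authors' implementation: the elevations are hypothetical, and a simple tolerance `eps` stands in for the interval techniques used to bound discretization and numerical error.

```python
# Toy sketch: classify cells by declivity sign and group adjacent cells
# with the same sign into subregions. Elevations are hypothetical.
elevations = [10.0, 12.0, 12.0, 9.0, 7.0, 7.0, 8.0]

def declivity_sign(delta, eps=1e-9):
    """Return +1, -1 or 0; eps bounds discretization/numerical error,
    standing in for the model's interval techniques."""
    if delta > eps:
        return 1
    if delta < -eps:
        return -1
    return 0

# One declivity value per cell boundary along the strip
signs = [declivity_sign(b - a) for a, b in zip(elevations, elevations[1:])]

# Group consecutive cells sharing a declivity sign into subregions
subregions = []  # each entry: [sign, number_of_cells]
for s in signs:
    if subregions and subregions[-1][0] == s:
        subregions[-1][1] += 1
    else:
        subregions.append([s, 1])
```

Because each cell's state depends only on its local neighbourhood, the classification step parallelizes trivially, which is the property the abstract highlights.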

Relevance: 20.00%

Abstract:

The Diversity Advisory Committee (DAC) will discuss the dynamics of the process of assessing diversity health at the University of Maryland Libraries. From designing the survey instrument, through analyzing the results, to writing the final report on diversity and inclusion, the committee members will unveil their challenges and achievements in presenting unbiased conclusions from this assessment project. In completing the project, the committee drew on expertise across the university, including (1) the College of Information Studies for creating the survey; (2) the Office of Institutional Research, Planning and Assessment (IRPA) and the Division of Information Technology (DIT) for analyzing the results; and (3) the Campus Assessment Working Group (CAWG) model for organizing the content of the final report.

Relevance: 20.00%

Abstract:

Presentation at the CRIS2016 conference in St Andrews, June 10, 2016

Relevance: 20.00%

Abstract:

Diagnostic techniques based on PCR have two major problems: false-positive reactions due to contamination with DNA fragments from previous PCRs (amplicons), and false-negative reactions caused by inhibitors that interfere with the PCR. We have improved our previously reported PCR, based on the amplification of a fragment of the Mycobacterium tuberculosis complex-specific insertion element IS6110, with respect to both problems. False-positive reactions caused by amplicon contamination were prevented by the use of uracil-N-glycosylase and dUTP instead of dTTP. We selected a new set of primers outside the region spanned by the formerly used primers to avoid false-positive reactions caused by dTTP-containing amplicons still present in the laboratory. With this new primer set, 16 copies of the IS6110 insertion element, the equivalent of two bacteria, could be amplified 10^10 times in 40 cycles, corresponding to a mean efficiency of 77% per cycle. To detect the presence of inhibitors of the Taq polymerase, which may cause false-negative reactions, part of each sample was spiked with M. tuberculosis DNA. The DNA purification method using guanidinium thiocyanate and diatoms effectively removed most or all inhibitors of the PCR. However, this method was not suitable for blood samples, for which we developed a proteinase K treatment followed by phenol-chloroform extraction. This method permitted detection of 20 M. tuberculosis bacteria per ml of whole blood. Various laboratory procedures were introduced to reduce failure or inhibition of the PCR and to avoid DNA cross-contamination. We tested 218 different clinical specimens obtained from patients suspected of having tuberculosis. The samples included sputum (n=145), tissue biopsy samples (n=25), cerebrospinal fluid (n=15), blood (n=14), pleural fluid (n=9), feces (n=7), fluid from fistulae (n=2), and pus from a wound (n=1). The results obtained by PCR were consistent with those obtained by culture, which is the "gold standard."
We demonstrate that PCR is a useful technique for the rapid diagnosis of tuberculosis at various sites.
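The quoted per-cycle efficiency follows directly from the amplification figures: if 40 cycles yield a 10^10-fold amplification, each cycle multiplies the copy number by (1 + e), so (1 + e)^40 = 10^10. A quick check:

```python
# Verifying the ~77%-per-cycle figure: (1 + e)**40 = 10**10
amplification = 10 ** 10  # overall fold-amplification reported
cycles = 40
efficiency = amplification ** (1 / cycles) - 1  # per-cycle efficiency
# efficiency is about 0.778, i.e. a mean efficiency of roughly 77% per cycle
```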

Relevance: 20.00%

Abstract:

The Central Highlands region has a unique climate that presents both challenges and novel farming-systems opportunities for cotton production. We have been re-examining the Emerald climate in a bid to identify opportunities that might enable the production of more consistent cotton yields and quality in what can be a highly variable climate. A detailed climatic analysis identified spring and early summer as the optimal period for boll growth and maturation. However, unlocking this potential requires unseasonal winter sowing, 4 to 6 weeks earlier than the traditional mid-September sowing. Our experiments have sought answers to two questions: (i) how much earlier can cotton be sown for reliable crop establishment and high yield; and (ii) can degradable plastic film mulches minimise the impact of potentially cold temperatures on crop establishment and early vigour? Initial data suggest that August sowing offers the potential to grow a high-yielding crop at a time of year with reduced risk of cloud and high night temperatures during boll growth. For the past two seasons, late winter sowing (with and without film) has resulted in a compact plant with high retention that reaches physiological maturity by the beginning of January. Even with the spectre of replanting in some seasons due to August frost, early sowing appears to offer more efficient use of crop inputs, simplified agronomic management and new crop rotation options during late summer and autumn. This talk will present an overview of results to date.

Relevance: 20.00%

Abstract:

As the semiconductor industry struggles to maintain its momentum along the path of Moore's Law, three-dimensional integrated circuit (3D IC) technology has emerged as a promising solution for higher integration density, better performance, and lower power consumption. However, despite its significant improvement in electrical performance, 3D IC presents several serious physical design challenges. In this dissertation, we investigate physical design methodologies for 3D ICs, with a primary focus on two areas: low-power 3D clock tree design, and reliability degradation modeling and management. Clock trees are essential parts of digital systems and dissipate a large amount of power due to their high capacitive loads. The majority of existing 3D clock tree designs focus on minimizing total wire length, which produces sub-optimal results for power optimization. In this dissertation, we formulate a 3D clock tree design flow which directly optimizes for clock power. We also investigate a design methodology for clock gating a 3D clock tree, which uses shutdown gates to selectively turn off unnecessary clock activity. Unlike the common assumption in 2D ICs that shutdown gates are cheap and can therefore be applied at every clock node, shutdown gates in 3D ICs introduce additional control TSVs, which compete with clock TSVs for placement resources. We explore design methodologies that produce the optimal allocation and placement of clock and control TSVs so that clock power is minimized. We show that the proposed synthesis flow saves significant clock power while accounting for the available TSV placement area. Vertical integration also brings new reliability challenges, including TSV electromigration (EM) and several other reliability-loss mechanisms caused by TSV-induced stress. These reliability-loss models involve complex inter-dependencies between electrical and thermal conditions, which have not been investigated in the past.
In this dissertation we set up an electrical/thermal/reliability co-simulation framework to capture the transient behavior of reliability loss in 3D ICs. We further derive and validate an analytical reliability objective function that can be integrated into the 3D placement design flow. The reliability-aware placement scheme enables co-design and co-optimization of both the electrical and reliability properties, improving both the circuit's performance and its lifetime. Our electrical/reliability co-design scheme avoids unnecessary design cycles and ad-hoc fixes that lead to sub-optimal performance. Vertical integration also enables stacking DRAM on top of the CPU, providing high bandwidth and short latency. However, non-uniform voltage fluctuation and local thermal hotspots in the CPU layers couple into the DRAM layers, causing a non-uniform distribution of bit-cell leakage and, thereby, of bit flips. We propose a performance-power-resilience simulation framework to capture DRAM soft errors in 3D multi-core CPU systems. In addition, we investigate a dynamic resilience management (DRM) scheme, which adaptively tunes the CPU's operating points to adjust the DRAM's voltage noise and thermal condition at runtime. The DRM uses dynamic frequency scaling to achieve a resilience borrow-in strategy, which effectively enhances DRAM resilience without sacrificing performance. The proposed physical design methodologies should act as important building blocks for 3D ICs and push 3D ICs toward mainstream acceptance in the near future.
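The DRM idea, trading CPU frequency against DRAM error rate, can be sketched as a simple control policy. This is a hypothetical illustration, not the dissertation's scheme: the frequency steps, the error budget, and the cubic error model are all assumed placeholders.

```python
# Hypothetical DRM sketch: pick the highest CPU frequency whose
# estimated DRAM bit-flip rate still fits a resilience budget.
FREQ_STEPS = [3.0, 2.5, 2.0, 1.5]  # available CPU frequencies in GHz (assumed)
ERROR_BUDGET = 1e-6                # tolerated DRAM bit-flip rate (assumed)

def estimated_error_rate(freq_ghz):
    """Stand-in model: higher frequency means more voltage noise and heat
    coupled into the stacked DRAM, hence a higher bit-flip rate."""
    return 1e-7 * freq_ghz ** 3

def choose_frequency():
    """Return the highest frequency whose estimated rate fits the budget."""
    for f in FREQ_STEPS:  # sorted from highest to lowest
        if estimated_error_rate(f) <= ERROR_BUDGET:
            return f
    return FREQ_STEPS[-1]  # fall back to the lowest step
```

Under these assumed numbers the policy would throttle from 3.0 GHz down to 2.0 GHz, illustrating how frequency scaling "borrows" resilience for the DRAM layers.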

Relevance: 20.00%

Abstract:

Metadata that is associated with either an information system or an information object for purposes of description, administration, legal requirements, technical functionality, use and usage, and preservation plays a critical role in ensuring the creation, management, preservation, use and re-use of trustworthy materials, including records. Recordkeeping metadata, of which one key type is archival description, plays a particularly important role in documenting the reliability and authenticity of records and recordkeeping systems, as well as the various contexts (legal-administrative, provenancial, procedural, documentary, and technical) within which records are created and kept as they move across space and time. In the digital environment, metadata is also the means by which it is possible to identify how record components – those constituent aspects of a digital record that may be managed, stored and used separately by the creator or the preserver – can be reassembled to generate an authentic copy of a record, or reformulated per a user's request as a customized output package. Issues relating to the creation, capture, management and preservation of adequate metadata are, therefore, integral to any research study addressing the reliability and authenticity of digital entities, regardless of the community, sector or institution within which they are being created. The InterPARES 2 Description Cross-Domain Group (DCD) examined the conceptualization, definitions, roles, and current functionality of metadata and archival description in terms of requirements generated by InterPARES 1.
Because of the need to communicate the work of InterPARES in a meaningful way not only across other disciplines but also across different archival traditions; to interface with, evaluate and inform existing standards, practices and other research projects; and to ensure interoperability across the three focus areas of InterPARES 2, the Description Cross-Domain Group also addressed its research goals with reference to wider thinking about, and developments in, recordkeeping and metadata. InterPARES 2 addressed not only records but a range of digital information objects (referred to as "entities" by InterPARES 2, not to be confused with the term "entities" as used in metadata and database applications) that are the products and by-products of government, scientific and artistic activities carried out using dynamic, interactive or experiential digital systems. The nature of these entities was determined through a diplomatic analysis undertaken as part of extensive case studies of digital systems conducted by the InterPARES 2 Focus Groups. This diplomatic analysis established whether the entities identified during the case studies were records, non-records that nevertheless raised important concerns relating to reliability and authenticity, or "potential records." To be determined to be records, the entities had to meet the criteria outlined by archival theory – they had to have a fixed documentary format and stable content. It was not sufficient that they be considered or treated as records by the creator. "Potential records" is a new construct indicating that a digital system has the potential to create records upon demand but does not actually fix and set aside records in the normal course of business.
The work of the Description Cross-Domain Group therefore addresses the metadata needs of all three categories of entities. Finally, since the term "metadata" is used today so ubiquitously, and in so many different ways by different communities, that it is in peril of losing any specificity, part of the work of the DCD sought to name and type categories of metadata. The group also addressed incentives for creators to generate appropriate metadata, as well as issues associated with the retention, maintenance and eventual disposition of the metadata that aggregates around digital entities over time.