966 results for "Uniform dichotomy"


Relevance: 10.00%

Abstract:

This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
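The forward Eulerian integration step described above can be sketched in a few lines. This is an illustrative 1D stand-in, not the authors' implementation: the single Gaussian-bump velocity field below replaces the paper's 4D B-spline kernel sum, and all numbers are invented.

```python
import numpy as np

def velocity(x, t):
    """Toy non-stationary 1D velocity field: one smooth spatial kernel
    modulated in time (stand-in for the paper's B-spline kernels)."""
    return 0.5 * np.exp(-((x - 0.5) ** 2) / 0.1) * np.sin(np.pi * t)

def integrate_displacement(x0, t_end, n_steps):
    """Forward Euler integration: x_{k+1} = x_k + dt * v(x_k, t_k).
    Returns the displacement field phi(x0, t_end) - x0."""
    dt = t_end / n_steps
    x = np.asarray(x0, dtype=float)
    for k in range(n_steps):
        x = x + dt * velocity(x, k * dt)
    return x - np.asarray(x0, dtype=float)

disp = integrate_displacement(np.linspace(0.0, 1.0, 5), t_end=1.0, n_steps=100)
```

In the same spirit, a strain estimate would follow from spatial derivatives of `disp` (e.g. via `np.gradient`), mirroring the paper's local strain-tensor computation.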

Relevance: 10.00%

Abstract:

PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice that relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of the number of decays to the number of cells, N(r): 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and an absorbed dose attributed to each cell equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution with a width equal to the statistical uncertainty consistent with the ratio of decays to cells, i.e., equal to N(r)^(-1/2). From dose-volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 µm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same previous scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values were comparable to within 2% between the adjusted simple sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and results were compared between the adjusted spherical and cellular models with similar agreement. The TCP values computed for macroscopic tumors were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
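The per-cell dose adjustment described in the Methods can be sketched as follows. The bin doses, cell counts, and the radiosensitivity of the simple exponential cell-kill model are made-up illustrative values, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def cell_doses(bin_mean_doses, cells_per_bin, decays_per_cell):
    """Assign each cell its bin's mean absorbed dose plus a Gaussian
    perturbation of relative width N_r**-0.5 (decays per cell)."""
    sigma_rel = decays_per_cell ** -0.5
    doses = []
    for mean_dose, n_cells in zip(bin_mean_doses, cells_per_bin):
        noise = rng.normal(0.0, sigma_rel * mean_dose, size=n_cells)
        doses.append(mean_dose + noise)
    return np.concatenate(doses)

def tcp(doses, alpha=0.3):
    """TCP as the product over cells of (1 - survival), with a simple
    exponential cell-kill model; alpha is an assumed radiosensitivity."""
    survival = np.exp(-alpha * doses)
    return float(np.prod(1.0 - survival))

# Three radial bins (mean dose in Gy, cell count), 100 decays per cell.
d = cell_doses([40.0, 35.0, 30.0], [200, 150, 100], decays_per_cell=100)
```

Doubling every cell's dose raises the TCP, as expected of a quantity that requires every cell to be killed.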

Relevance: 10.00%

Abstract:

PURPOSE: To explore detainees' and staff's attitudes towards tobacco use, in order to assist prison administrators to develop an ethically acceptable tobacco control policy based on stakeholders' opinions. DESIGN: Qualitative study based on in-depth semi-structured interviews with 31 prisoners and 27 staff before (T1) and after (T2) the implementation of a new smoke-free regulation (2009) in a Swiss male post-trial prison housing 120 detainees and employing 120 staff. RESULTS: At T1, smoking was allowed in common indoor rooms and most working places. Both groups of participants expressed the need for a more uniform and stricter regulation, with general opposition towards a total smoking ban. Expressed fears and difficulties regarding a stricter regulation included increased stress on detainees and strain on staff, violence, riots, loss of control over detainees, and changes in social life. At T2, participants expressed predominantly satisfaction. They reported a reduction in their own tobacco use and better protection against second-hand smoke. However, enforcement was incomplete. The debate was felt to be focused on regulation only, leaving aside the subject of tobacco reduction or cessation support. CONCLUSION: Besides an appropriate smoke-free regulation, further developments are necessary in order to have a comprehensive tobacco control policy in prisons.

Relevance: 10.00%

Abstract:

Error-correcting codes and matroids have been widely used in the study of ordinary secret sharing schemes. In this paper, the connections between codes, matroids, and a special class of secret sharing schemes, namely, multiplicative linear secret sharing schemes (LSSSs), are studied. Such schemes are known to enable multiparty computation protocols secure against general (nonthreshold) adversaries. Two open problems related to the complexity of multiplicative LSSSs are considered in this paper. The first one deals with strongly multiplicative LSSSs. As opposed to the case of multiplicative LSSSs, it is not known whether there is an efficient method to transform an LSSS into a strongly multiplicative LSSS for the same access structure with a polynomial increase of the complexity. A property of strongly multiplicative LSSSs that could be useful in solving this problem is proved. Namely, using a suitable generalization of the well-known Berlekamp–Welch decoder, it is shown that all strongly multiplicative LSSSs enable efficient reconstruction of a shared secret in the presence of malicious faults. The second one is to characterize the access structures of ideal multiplicative LSSSs. Specifically, the considered open problem is to determine whether all self-dual vector space access structures are in this situation. By the aforementioned connection, this in fact constitutes an open problem about matroid theory, since it can be restated in terms of representability of identically self-dual matroids by self-dual codes. A new concept is introduced, the flat-partition, which provides a useful classification of identically self-dual matroids. Uniform identically self-dual matroids, which are known to be representable by self-dual codes, form one of the classes. It is proved that this property also holds for the family of matroids that, in a natural way, is the next class in the above classification: the identically self-dual bipartite matroids.
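As background for the multiplicative property discussed above, here is a minimal textbook sketch (not the paper's construction) of why Shamir's scheme is a multiplicative LSSS when n >= 2t + 1: pairwise products of shares form a degree-2t sharing of the product of the secrets, recoverable by Lagrange interpolation. The field modulus, secrets, and polynomial coefficients are arbitrary choices.

```python
P = 2_147_483_647  # prime field modulus (an arbitrary choice)

def share(secret, t, n, coeffs):
    """Shamir shares of `secret` at points x = 1..n, using the given
    degree-t polynomial coefficients (secret is the constant term)."""
    poly = [secret] + list(coeffs)
    return [sum(c * pow(x, i, P) for i, c in enumerate(poly)) % P
            for x in range(1, n + 1)]

def reconstruct(points):
    """Lagrange interpolation at x = 0 over GF(P); points are (x, y)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

t, n = 1, 3                          # n >= 2t + 1 parties
a = share(20, t, n, coeffs=[7])      # shares of secret 20
b = share(30, t, n, coeffs=[11])     # shares of secret 30
# Local share products: a degree-2t sharing of 20 * 30 = 600.
prod_shares = [(x, ai * bi % P) for x, (ai, bi) in enumerate(zip(a, b), 1)]
```

The strongly multiplicative case studied in the paper additionally requires reconstruction of the product from the shares of honest parties only, which is where the Berlekamp–Welch-style decoding comes in.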

Relevance: 10.00%

Abstract:

Deciding whether two fingerprint marks originate from the same source requires examination and comparison of their features. Many cognitive factors play a major role in such information processing. In this paper we examined the consistency (both between- and within-experts) in the analysis of latent marks, and whether the presence of a 'target' comparison print affects this analysis. Our findings showed that the context of a comparison print affected analysis of the latent mark, possibly influencing allocation of attention, visual search, and the threshold for determining a 'signal'. We also found that even without the context of the comparison print there was still a lack of consistency in analysing latent marks. Not only was this reflected by inconsistency between different experts, but the same experts at different times were inconsistent with their own analysis. However, the characterization of these inconsistencies depends on the standard and definition of what constitutes an inconsistency. Furthermore, these effects were not uniform; the lack of consistency varied across fingerprints and experts. We propose solutions to mediate variability in the analysis of friction ridge skin.

Relevance: 10.00%

Abstract:

Anchoring a flap remains a key procedure in decubitus ulcer surgery because the flap needs to be stable against shearing forces; this allows early mobilization and undisturbed primary wound healing. This study evaluated a uniform group of eight paraplegic patients with sacral decubitus ulcers whose lesions were covered using gluteal rotation flaps with a deepithelialized tip anchoring the flap subcutaneously on the contralateral ischial tuberosity. Initial wound healing and recurrence after one year were evaluated. All but one flap showed uneventful wound healing, and all the flaps presented without any signs of recurrence or instability. The authors suggest that sufficient anchoring using a deepithelialized part of the flap helps to integrate and stabilize sacral rotation flaps.

Relevance: 10.00%

Abstract:

Forensic science casework involves making a series of choices. The difficulty in making these choices lies in the inevitable presence of uncertainty, the unique context of circumstances surrounding each decision and, in some cases, the complexity due to numerous, interrelated random variables. Given that these decisions can lead to serious consequences in the administration of justice, forensic decision making should be supported by a robust framework that makes inferences under uncertainty and decisions based on these inferences. The objective of this thesis is to respond to this need by presenting a framework for making rational choices in decision problems encountered by scientists in forensic science laboratories. Bayesian inference and decision theory meet the requirements for such a framework. To attain its objective, this thesis consists of three propositions, advocating the use of (1) decision theory, (2) Bayesian networks, and (3) influence diagrams for handling forensic inference and decision problems. The results present a uniform and coherent framework for making inferences and decisions in forensic science using the above theoretical concepts. They describe how to organize each type of problem by breaking it down into its different elements, and how to find the most rational course of action by distinguishing between one-stage and two-stage decision problems and applying the principle of expected utility maximization. To illustrate the framework's application to the problems encountered by scientists in forensic science laboratories, theoretical case studies apply decision theory, Bayesian networks and influence diagrams to a selection of different types of inference and decision problems dealing with different categories of trace evidence. Two studies of the two-trace problem illustrate how the construction of Bayesian networks can handle complex inference problems, and thus overcome the hurdle of complexity that can be present in decision problems. Three studies (one on what to conclude when a database search provides exactly one hit, one on what genotype to search for in a database based on the observations made on DNA typing results, and one on whether to submit a fingermark to the process of comparing it with prints of its potential sources) explain the application of decision theory and influence diagrams to each of these decisions. The results of the theoretical case studies support the thesis's three propositions. Hence, this thesis presents a uniform framework for organizing and finding the most rational course of action in decision problems encountered by scientists in forensic science laboratories. The proposed framework is an interactive and exploratory tool for better understanding a decision problem so that this understanding may lead to better informed choices.
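The expected-utility principle that the thesis builds on can be illustrated with a toy example: choose the action maximizing the sum over hypotheses h of P(h | evidence) * U(action, h). The hypotheses, posterior probabilities, and utilities below are invented for illustration only, not taken from the thesis's case studies.

```python
# Assumed posterior probabilities over the two source hypotheses.
posteriors = {"same_source": 0.85, "different_source": 0.15}

# Assumed utilities U(action, hypothesis); a false 'match' report is
# penalized heavily relative to an inconclusive report.
utilities = {
    ("report_match", "same_source"): 10.0,
    ("report_match", "different_source"): -100.0,
    ("report_inconclusive", "same_source"): -1.0,
    ("report_inconclusive", "different_source"): -1.0,
}

def expected_utility(action):
    """Expected utility of an action under the posterior distribution."""
    return sum(posteriors[h] * utilities[(action, h)] for h in posteriors)

def best_action(actions):
    """The expected-utility-maximizing action."""
    return max(actions, key=expected_utility)

choice = best_action(["report_match", "report_inconclusive"])
```

With these numbers the heavy penalty on a false match makes the inconclusive report the rational choice even at a posterior of 0.85, which is exactly the kind of sensitivity an influence diagram makes explicit.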

Relevance: 10.00%

Abstract:

Many theoretical dissertations have an unclear definition of diversity, and when interpreting strategies of organizational diversity policies, theories often contradict each other. It is argued that this ambiguity and controversy can be diminished by basing theory on diversity and diversity policy more on qualitatively structured, descriptive empirical comparisons. This argument is elaborated in two steps. First, diversity is shown to be a social construction: dynamic and plural in nature, dependent on the social-historical context. Second, the common theoretical dichotomy between diversity policy as equal opportunities or as diversity management is shown to be possibly misleading; empirical studies indicate more practical differentiation in types of diversity policy, manifested in public and private organizations. As qualitative comparisons are rare, especially in the European context and especially among public organizations, this article calls for more contributions of this kind and provides an analytical framework to assist scholars in the field of diversity studies.

Relevance: 10.00%

Abstract:

The constitutive Cauliflower Mosaic Virus 35S promoter (CaMV 35S) is widely used as a tool to express recombinant proteins in plants, but with varying success. We previously showed that the expression of an F-actin marker, GFP-talin, in Physcomitrella patens using the CaMV 35S promoter failed to homogeneously label moss tissues. Here, we show a significant diminution of the GFP fluorescence in dark-grown old moss cells and a complete lack of labelling in newly differentiated cells. Furthermore, we demonstrate that stable moss lines harbouring a resistance cassette driven by the CaMV 35S are unable to grow in darkness in the presence of the antibiotic. In contrast to the CaMV 35S, the heat-inducible promoter hsp17.3B showed a uniform expression pattern in all cells and tissues following a mild heat shock.

Relevance: 10.00%

Abstract:

Deformation of the Circum-Rhodope Belt Mesozoic (Middle Triassic to earliest Lower Cretaceous) low-grade schists underneath an arc-related ophiolitic magmatic suite and associated sedimentary successions in the eastern Rhodope-Thrace region occurred as a two-episode tectonic process: (i) Late Jurassic deformation of arc to margin units resulting from the collision of the eastern Rhodope-Evros arc with the Rhodope terrane continental margin and accretion to that margin, and (ii) Middle Eocene deformation related to the Tertiary crustal extension and final collision resulting in the closure of the Vardar ocean south of the Rhodope terrane. The first deformational event D-1 is expressed by Late Jurassic NW-N vergent fold generations and the main and subsidiary planar-linear structures. Although overprinting, these structural elements depict uniform bulk north-directed thrust kinematics and are geometrically compatible with the increments of progressive deformation that developed under the same greenschist-facies metamorphic grade. This deformation followed the Early-Middle Jurassic magmatic evolution of the eastern Rhodope-Evros arc, established on the upper plate of the southward-subducting Maliac-Meliata oceanic lithosphere, a subduction that opened the Vardar Ocean in a supra-subduction back-arc setting. This first event resulted in the thrust-related tectonic emplacement of the Mesozoic schists at a supra-crustal level onto the Rhodope continental margin. This Late Jurassic-Early Cretaceous tectonic event, related to the N-vergent Balkan orogeny, is well constrained by geochronological data and traced at a regional scale within distinct units of the Carpatho-Balkan Belt. Following subduction reversal towards the north, whereby the Vardar Ocean was subducted beneath the Rhodope margin by latest Cretaceous times, the low-grade schists acquired a new position in the upper plate; hence, the Mesozoic schists lack the Cretaceous S-directed tectono-metamorphic episode whose effects are widespread in the underlying high-grade basement. The subduction of the remnant Vardar Ocean located behind the colliding arc since the middle Cretaceous was responsible for its ultimate closure; the Early Tertiary collision with the Pelagonian block and extension in the region caused the extensional collapse related to the second deformational event D-2. This extensional episode was experienced passively by the Mesozoic schists located in the hanging wall of the extensional detachments in Eocene times. It resulted in NE-SW oriented open folds representing corrugation antiforms of the extensional detachment surfaces, brittle faulting, and a burial history beneath thick Eocene sediments as indicated by 42.1-39.7 Ma Ar-40/Ar-39 mica plateau ages obtained in the study. The results provide structural constraints on the involvement of components of a Jurassic paleo-subduction zone in a Late Jurassic arc-continental margin collisional history that contributed to accretion-related crustal growth of the Rhodope terrane.

Relevance: 10.00%

Abstract:

Heart transplantation (HTx) started in 1987 at two university hospitals (CHUV, HUG) in western Switzerland, with 223 HTx performed at the CHUV by December 2010. Between 1987 and 2003, 106 HTx were performed at the HUG, for a total of 329 HTx in western Switzerland. After the reorganization of organ transplantation activity in western Switzerland in 2003, the surgical part and the early postoperative care of HTx remained limited to the CHUV; however, all other HTx activities are pursued at both university hospitals (CHUV, HUG). This article summarizes the current protocols for selection and pre-transplant follow-up of HTx candidates in western Switzerland, providing a uniform structure for pre-transplant follow-up in the region.

Relevance: 10.00%

Abstract:

To enhance the clinical value of coronary magnetic resonance angiography (MRA), high-relaxivity contrast agents have recently been used at 3T. Here we examine a uniform bilateral shadowing artifact observed along the coronary arteries in MRA images collected using such a contrast agent. Simulations were performed to characterize this artifact, including its origin, to determine how best to mitigate this effect, and to optimize a data acquisition/injection scheme. An intraluminal contrast agent concentration model was used to simulate various acquisition strategies with two profile orders for a slow-infusion of a high-relaxivity contrast agent. Filtering effects from temporally variable weighting in k-space are prominent when a centric, radial (CR) profile order is applied during contrast infusion, resulting in decreased signal enhancement and underestimation of vessel width, while both pre- and postinfusion steady-state acquisitions result in overestimation of the vessel width. Acquisition during the brief postinfusion steady-state produces the greatest signal enhancement and minimizes k-space filtering artifacts.
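The k-space filtering mechanism described above can be reproduced in a toy 1D simulation: with a centric profile order, the k-space centre is sampled first, while the contrast (and thus the signal) is still low, so the acquisition applies an uneven weighting across k-space. The object, signal ramp, and array sizes below are arbitrary illustrative choices, not the paper's simulation model.

```python
import numpy as np

n = 64
obj = np.zeros(n)
obj[24:40] = 1.0                      # a "vessel" of width 16 samples
k_true = np.fft.fftshift(np.fft.fft(obj))

# Centric order: acquire the k-space centre first, then move outwards.
centre = n // 2
order = sorted(range(n), key=lambda i: abs(i - centre))

# Assumed contrast still ramping up during the acquisition: the signal
# grows linearly from 40% to 100% of its steady-state value.
signal = np.linspace(0.4, 1.0, n)

k_meas = np.empty(n, dtype=complex)
for shot, idx in enumerate(order):
    k_meas[idx] = signal[shot] * k_true[idx]   # shot-dependent weighting

img = np.abs(np.fft.ifft(np.fft.ifftshift(k_meas)))
```

Because the centre of k-space (which sets overall signal) is acquired at low contrast while the edges are acquired at full contrast, the reconstructed vessel shows reduced enhancement and edge artifacts, qualitatively matching the filtering effect the abstract attributes to centric radial ordering during infusion.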

Relevance: 10.00%

Abstract:

It is commonly regarded that the overuse of traffic control devices desensitizes drivers and leads to disrespect, especially on low-volume secondary roads with limited enforcement. The maintenance of traffic signs is also a tort liability concern, exacerbated by unnecessary signs. The Federal Highway Administration's (FHWA) Manual on Uniform Traffic Control Devices (MUTCD) and the Institute of Transportation Engineers' (ITE) Traffic Control Devices Handbook provide guidance for the implementation of STOP signs based on expected compliance with right-of-way rules, provision of through traffic flow, context (proximity to other controlled intersections), speed, sight distance, and crash history. The choice of which approach(es) to stop is left to engineering judgment and usually depends on traffic volume or the functional class/continuity of the system. Although presently being considered by the National Committee on Traffic Control Devices, traffic volume itself is not given as a criterion for implementation in the MUTCD. STOP signs have been installed at many locations for various reasons which no longer (or perhaps never) met engineering needs. If in fact the presence of STOP signs does not increase safety, removal should be considered. To date, however, no guidance exists for the removal of STOP signs at two-way stop-controlled intersections. The scope of this research is ultra-low-volume (<150 daily entering vehicles) unpaved intersections in rural agricultural areas of Iowa, where each of the 99 counties may have as many as 300 or more STOP sign pairs. Overall safety performance is examined as a function of a county excessive use factor, developed specifically for this study and based on various volume ranges and terrain as a proxy for sight distance.
Four conclusions are supported: (1) there is no statistical difference in the safety performance of ultra-low-volume stop-controlled and uncontrolled intersections for all drivers or for younger and older drivers (although interestingly, older drivers are underrepresented at both types of intersections); (2) compliance with stop control (as indicated by crash performance) does not appear to be affected by the use or excessive use of STOP signs, even when adjusted for volume and a sight distance proxy; (3) crash performance does not appear to be improved by the liberal use of stop control; (4) safety performance of uncontrolled intersections appears to decline relative to stop-controlled intersections above about 150 daily entering vehicles. Subject to adequate sight distance, traffic professionals may wish to consider removal of control below this threshold. The report concludes with a section on methods and legal considerations for safe removal of stop control.

Relevance: 10.00%

Abstract:

Executive Orders from Governor Hughes. Uniform Criminal Extradition

Relevance: 10.00%

Abstract:

Severe environmental conditions, coupled with the routine use of deicing chemicals and increasing traffic volume, tend to place extreme demands on portland cement concrete (PCC) pavements. In most instances, engineers have been able to specify and build PCC pavements that met these challenges. However, there have also been reports of premature deterioration that could not be specifically attributed to a single cause. Modern concrete mixtures have evolved to become very complex chemical systems. The complexity can be attributed to both the number of ingredients used in any given mixture and the various types and sources of the ingredients supplied to any given project. Local environmental conditions can also influence the outcome of paving projects. This research project investigated important variables that impact the homogeneity and rheology of concrete mixtures. The project consisted of a field study and a laboratory study. The field study collected information from six different projects in Iowa. The information that was collected during the field study documented cementitious material properties, plastic concrete properties, and hardened concrete properties. The laboratory study was used to develop baseline mixture variability information for the field study. It also investigated plastic concrete properties using various new devices to evaluate rheology and mixing efficiency. In addition, the lab study evaluated a strategy for the optimization of mortar and concrete mixtures containing supplementary cementitious materials. The results of the field studies indicated that the quality management concrete (QMC) mixtures being placed in the state generally exhibited good uniformity and good to excellent workability. Hardened concrete properties (compressive strength and hardened air content) were also satisfactory. 
The uniformity of the raw cementitious materials that were used on the projects could not be monitored as closely as was desired by the investigators; however, the information that was gathered indicated that the bulk chemical composition of most materials streams was reasonably uniform. Specific minerals phases in the cementitious materials were less uniform than the bulk chemical composition. The results of the laboratory study indicated that ternary mixtures show significant promise for improving the performance of concrete mixtures. The lab study also verified the results from prior projects that have indicated that bassanite is typically the major sulfate phase that is present in Iowa cements. This causes the cements to exhibit premature stiffening problems (false set) in laboratory testing. Fly ash helps to reduce the impact of premature stiffening because it behaves like a low-range water reducer in most instances. The premature stiffening problem can also be alleviated by increasing the water–cement ratio of the mixture and providing a remix cycle for the mixture.