889 results for shape vs use
Abstract:
We have modified a technique that uses a single pair of primer sets directed against homologous but distinct genes on the X and Y chromosomes, all of which are coamplified in the same reaction tube with trace amounts of radioactivity. The resulting bands are equal in length, yet distinguishable by restriction enzyme sites that generate two independent bands: a 364 bp X-specific band and a 280 bp Y-specific band. A standard curve was generated to show the linear relationship between the average X/Y ratio and the %Y or %X chromosomal content. Of the 51 purified amniocyte DNA samples analyzed, 16 showed evidence of high %X contamination while 2 demonstrated a higher %Y than the expected 50% X and 50% Y chromosomal content. With regard to the 25 processed sperm samples analyzed, X-sperm enrichment was evident when compared to the primary sex ratio, whereas Y-sperm enrichment was evident when samples taken before and after selection were compared.
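The standard-curve step lends itself to a small numerical sketch: fit a line relating the measured X/Y band ratio to the known %Y of calibration mixtures, then invert it for unknown samples. The calibration values below are hypothetical illustrations, not the study's data.

```python
import numpy as np

def fit_standard_curve(percent_y, xy_ratio):
    """Least-squares line relating measured X/Y band ratio to %Y content."""
    slope, intercept = np.polyfit(percent_y, xy_ratio, 1)
    return slope, intercept

def estimate_percent_y(ratio, slope, intercept):
    """Invert the standard curve to estimate %Y from a measured ratio."""
    return (ratio - intercept) / slope

# Hypothetical calibration mixtures (known %Y) and measured X/Y ratios;
# the X/Y ratio falls linearly as the %Y content rises.
known_py = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
ratios = np.array([4.0, 3.0, 2.0, 1.0, 0.0])

m, b = fit_standard_curve(known_py, ratios)
print(round(estimate_percent_y(2.0, m, b), 1))  # a 50:50 sample → 50.0
```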
Abstract:
I thank the authors of previous studies on global variation in insect thermal tolerances who have generously provided open access use of their data sets.
Abstract:
Acknowledgements K. Ashbrook, M. Barrueto, K. Elner, A. Hargreaves, S. Jacobs, G. Lancton, M. LeVaillant, E. Grosbellet, A. Moody, A. Ronston, J. Provencher, P. Smith, K. Woo and P. Woodward helped in the field. J. Nakoolak kept us safe from bears. N. Sapir and two anonymous reviewers provided very useful comments on an earlier version of our manuscript. R. Armstrong at the Nunavut Research Institute, M. Mallory at the Canadian Wildlife Service Northern Research Division and C. Eberl at National Wildlife Research Centre in Ottawa provided logistical support. F. Crenner, N. Chatelain and M. Brucker customized the GPS at the IPHC-CNRS. KHE received financial support through a NSERC Vanier Canada Graduate Scholarship, ACUNS Garfield Weston Northern Studies scholarship and AINA Jennifer Robinson Scholarship and JFH received NSERC Discovery Grant funding. J. Welcker generously loaned some accelerometers. All procedures were approved under the guidelines of the Canadian Council for Animal Care.
Abstract:
X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; as a result of this, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.
A recent push from the medical and scientific community towards using lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow for scanning at reduced doses while maintaining the image quality of the exam at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate to assess modern CT scanners that have implemented those aforementioned dose reduction technologies.
Thus the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms, (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.
The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection [FBP] vs. Advanced Modeled Iterative Reconstruction [ADMIRE]). A mathematical observer model (i.e., computer detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, for large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (increase in detectability index by up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.
Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
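As a rough illustration of the simplest member of the matched-filter family, a non-prewhitening matched-filter observer applies the expected signal itself as its template and scores detectability as the template response over its standard deviation under noise. The sketch below uses a hypothetical square signal and synthetic white noise, not the study's image data.

```python
import numpy as np

def dprime_npw(signal, noise_images):
    """Detectability index of a non-prewhitening matched-filter observer.

    The template is the expected signal; d' is the mean template-response
    difference between signal-present and signal-absent images divided by
    the response standard deviation under noise alone.
    """
    t = signal.ravel()  # NPW template = expected signal
    responses = noise_images.reshape(len(noise_images), -1) @ t
    return float(t @ t / np.std(responses))

rng = np.random.default_rng(0)
sig = np.zeros((16, 16))
sig[6:10, 6:10] = 1.0                             # simple square "lesion"
noise = rng.normal(0.0, 1.0, size=(500, 16, 16))  # white-noise samples
print(dprime_npw(sig, noise))                     # ≈ ||signal|| for unit white noise
```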
The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, given their simplicity and the complexity of iterative reconstruction algorithms, such phantoms may not be fully adequate to assess the clinical impact of iterative algorithms, because patient images do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that with FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
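The image subtraction idea can be sketched as follows: subtracting two repeated scans of the same phantom cancels the fixed background (uniform or textured), and dividing by sqrt(2) restores the single-image quantum noise level. This is a minimal illustration with synthetic data, not the study's measurement code.

```python
import numpy as np

def quantum_noise(scan_a, scan_b, roi):
    """Estimate quantum noise via subtraction of two repeated scans.

    The identical background cancels in the difference image; the sqrt(2)
    factor corrects for the variance doubling from subtracting two
    independent noise realizations.
    """
    diff = (scan_a - scan_b) / np.sqrt(2.0)
    return float(np.std(diff[roi]))

rng = np.random.default_rng(1)
texture = rng.normal(50.0, 20.0, size=(64, 64))  # fixed background texture
sigma = 5.0                                      # true quantum-noise level
a = texture + rng.normal(0.0, sigma, texture.shape)
b = texture + rng.normal(0.0, sigma, texture.shape)
roi = np.ones(texture.shape, dtype=bool)
print(round(quantum_noise(a, b, roi), 1))        # recovers sigma, not the texture std
```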
To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing the image quality of iterative algorithms.
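For square ROIs, an ensemble NPS estimate reduces to averaging squared DFT magnitudes of mean-subtracted noise realizations; the irregular-ROI method developed in the dissertation is more involved, but a minimal square-ROI sketch (with synthetic white noise, not the study's data) looks like this:

```python
import numpy as np

def nps_2d(noise_rois, pixel_size=1.0):
    """Ensemble 2D noise power spectrum from square, zero-mean noise ROIs."""
    n = noise_rois.shape[-1]
    centered = noise_rois - noise_rois.mean(axis=(-2, -1), keepdims=True)
    dfts = np.fft.fft2(centered)
    # Average the periodograms over the ensemble of realizations.
    return (pixel_size ** 2 / (n * n)) * np.mean(np.abs(dfts) ** 2, axis=0)

rng = np.random.default_rng(2)
rois = rng.normal(0.0, 3.0, size=(200, 32, 32))  # white noise, sigma = 3
nps = nps_2d(rois)
# Parseval check: the NPS integrates back to the pixel variance (sigma^2 = 9).
print(nps.shape, round(float(nps.mean()), 1))
```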
To move beyond assessing noise properties in textured phantoms towards assessing detectability, a series of new phantoms were designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized with a genetic algorithm to match the texture in the liver regions of actual patient CT images. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in textured phantoms.
The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
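As a hedged sketch of such an analytical lesion model, the function below describes a radially symmetric lesion by size (radius), contrast, and a sigmoid edge profile, then adds its voxelized form to a background ROI to create a "hybrid" image. All names and values are illustrative assumptions, not the dissertation's actual models.

```python
import numpy as np

def lesion_model(shape, center, radius, contrast, edge_width):
    """Analytical lesion: a disc of given contrast with a sigmoid edge profile.

    The value falls smoothly from `contrast` inside the radius to zero
    outside, with `edge_width` controlling the sharpness of the edge.
    """
    yy, xx = np.indices(shape)
    r = np.hypot(yy - center[0], xx - center[1])
    return contrast / (1.0 + np.exp((r - radius) / edge_width))

# Voxelize a subtle low-contrast lesion (illustrative parameters).
lesion = lesion_model((64, 64), (32, 32), radius=10, contrast=-15.0, edge_width=1.5)

# "Hybrid" image: lesion inserted into a (here synthetic) patient ROI, so the
# ground-truth morphology and location are known exactly.
rng = np.random.default_rng(3)
background = rng.normal(60.0, 8.0, size=(64, 64))  # stand-in liver ROI, in HU
hybrid = background + lesion
print(round(float(lesion[32, 32]), 1))  # core reaches the full -15 HU contrast
```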
Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.
The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-free images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.
Abstract:
BACKGROUND: Attention-deficit/hyperactivity disorder (ADHD) is a risk factor for problematic cannabis use. However, clinical and anecdotal evidence suggest an increasingly popular perception that cannabis is therapeutic for ADHD, including via online resources. Given that the Internet is increasingly utilized as a source of healthcare information and may influence perceptions, we conducted a qualitative analysis of online forum discussions (also referred to as threads) on the effects of cannabis on ADHD to systematically characterize the content patients and caregivers may encounter about ADHD and cannabis. METHODS: A total of 268 separate forum threads were identified. Twenty percent (20%) were randomly selected, which yielded 55 separate forum threads (mean number of individual posts per forum thread = 17.53) scored by three raters (Cohen's kappa = 0.74). A final sample of 401 posts in these forum threads received at least one endorsement on predetermined topics following qualitative coding procedures. RESULTS: Twenty-five percent (25%) of individual posts indicated that cannabis is therapeutic for ADHD, as opposed to 8% that it is harmful, 5% that it is both therapeutic and harmful, and 2% that it has no effect on ADHD. This pattern was generally consistent when the year of each post was considered. The greater endorsement of therapeutic versus harmful effects of cannabis did not generalize to mood, other (non-ADHD) psychiatric conditions, or overall domains of daily life. Additional themes emerged (e.g., cannabis being considered sanctioned by healthcare providers). CONCLUSIONS: Although there are no clinical recommendations or systematic research supporting the beneficial effects of cannabis use for ADHD, online discussions indicate that cannabis is considered therapeutic for ADHD; this is the first study to identify such a trend. This type of online information could shape ADHD patient and caregiver perceptions, and influence cannabis use and clinical care.
Abstract:
This dissertation presents a theory of formal function and phrase structure in contemporary music, one that can be used both as an analytical tool and as a means of creating new works. Two current theoretical concepts help clarify phrase structure: Christopher Hasty's temporal projections and William Caplin's theory of formal functions, which includes the concept of tight-knit versus loose formal organization. Temporal projections are made perceptible through emphasis on secondary parameters such as playing style, articulation, and timbre. Sections with tight-knit formal organization have clear temporal projections, created by the juxtaposition of distinct motives, generally in the form of a two-part basic idea. These projections organize the music into presentation phrases, continuation phrases and, at formal junctures, cadential phrases. Sections with looser organization tend to present less clear projections and harmonic motion and less motivic uniformity. The phrase structure of three late solo pieces by Pierre Boulez is analyzed: Anthèmes I for violin (1991-1992) and two piano pieces, Incises (2001) and une page d'éphéméride (2005). The ideas proposed in this document follow from an analysis of these works and have strongly influenced my own compositions, in particular Lucretia Overture for orchestra and 4 Impromptus for flute, soprano saxophone and piano, which are also analyzed in detail.
Several additional compositional techniques can be discerned in these two works, including the use of melodic sequences to control harmonic rhythm; passages composed of several musical layers, each with a distinct phrase structure; and the loosening of the formal organization of recurring material. Finally, the composition of several other earlier works gave rise to techniques used in these two works, and these are briefly discussed in the final section.
Abstract:
Youth sport coaches shape the developmental sporting experience for their athletes (Camiré, Trudel, & Forneris, 2014). Specifically, coaches who form individualized, supportive relationships with their athletes can increase the development of personal and social skills (Fraser-Thomas, Côté, & Deakin, 2005). In light of the value of these relationships, increasing evidence is prompting the application of leadership theories, such as Transformational Leadership (TFL), in youth sport (Vella et al., 2013). The aim of this study was to explore coach perceptions of how and why leadership behaviours are applied in the youth sport context. Eleven coaches (Mage = 42.3, SD = 15.2) were recruited from competitive youth soccer and volleyball clubs (athletes’ Mage = 15.8, SD = 1.9) in Eastern Ontario and participated in a stimulated recall interview. During the interviews, coaches reflected upon their own coaching behaviours and provided insight into the application of leadership behaviours in youth sport. Responses were prompted by relevant video sequences from recorded practice and game sessions. A thematic content analysis revealed that: (i) coaches use a variety of leadership behaviours in youth sport; (ii) the use of leadership behaviours varies across sport contexts or settings; and (iii) contrasting leadership styles (e.g., transactional vs. transformational) are associated with distinctive coach objectives (e.g., promoting confidence vs. establishing respect). These findings have helped identify gaps within coach education and provide theoretical insight for applying leadership theories, and more specifically TFL, to help improve the sport experiences of young athletes.
Abstract:
Shape-based registration methods are frequently encountered in the domains of computer vision, image processing and medical imaging. The registration problem is to find an optimal transformation/mapping between sets of rigid or nonrigid objects and to automatically solve for correspondences. In this paper we present a comparison of two different probabilistic methods, entropy and the growing neural gas network (GNG), as general feature-based registration algorithms. Using entropy, shape modelling is performed by connecting the point sets with the highest probability of curvature information, while with GNG the point sets are connected using nearest-neighbour relationships derived from competitive Hebbian learning. In order to compare performances we use different levels of shape deformation, starting with a simple 2D shape (MRI brain ventricles) and moving to more complicated shapes such as hands. Results are given both quantitatively and qualitatively for both sets.
Abstract:
Calcitic belemnite rostra are usually employed for paleoenvironmental studies based on geochemical data. However, several questions, such as their original porosity and microstructure, remain open, even though they are essential for making accurate interpretations based on geochemical analyses. This paper revisits and sheds light on some of these questions. Petrographic data demonstrate that calcite crystals of the rostrum solidum of belemnites grow from spherulites that successively develop along the apical line, resulting in a “regular spherulitic prismatic” microstructure. Radially arranged calcite crystals emerge and diverge from the spherulites: towards the apex, crystals grow until a new spherulite is formed; towards the external walls of the rostrum, the crystals become progressively bigger and prismatic. Adjacent crystals vary slightly in their c-axis orientation, resulting in undulose extinction. Concentric growth layering develops at different scales and is superimposed on and traversed by a radial pattern, which results in the micro-fibrous texture observed in the calcite crystals of the rostra. Petrographic data demonstrate that single calcite crystals in the rostra have a composite nature, which strongly suggests that the belemnite rostra were originally porous. Single crystals consistently comprise two distinct zones or sectors in optical continuity: 1) the inner zone is fluorescent, has relatively low optical relief under transmitted light (TL) microscopy, a dark-grey color under backscattered electron microscopy (BSEM), a commonly triangular shape, a “patchy” appearance and relatively high Mg and Na contents; 2) the outer sector is non-fluorescent, has relatively high optical relief under TL, a light-grey color under BSEM and low Mg and Na contents.
The inner, fluorescent sectors are interpreted to have formed first, as a product of biologically controlled mineralization during belemnite skeletal growth, and the non-fluorescent outer sectors as overgrowths of the former, filling the intra- and inter-crystalline porosity. This question has important implications for making paleoenvironmental and/or paleoclimatic interpretations based on geochemical analyses of belemnite rostra. Finally, the petrographic features of composite calcite crystals in the rostra also suggest non-classical crystallization of belemnite rostra, as previously proposed by other authors.
Abstract:
The TTIP is a proposal for negotiations between the EU and the USA to create the largest free international trade area in existence by extension, population and volume of trade. In our view, TTIP would be the geoeconomic answer to the BRICS (Brazil, Russia, India, China and South Africa) as a commercial, geopolitical and cooperation space in other areas, such as the military, insofar as TTIP would reproduce on a commercial scale the political and military alliance that already exists between a good part of the EU and the USA through NATO. In this paper we try to explain why the possible rivalry between TTIP and BRICS would reproduce in the 21st century the "Cold War" schemes inherited from the 20th century, which in turn reproduced the geopolitical confrontations arising from Halford Mackinder's pivot-area theory and the traditional opposition between thalassocratic imperialisms (rule over the seas and oceans) and tellurocratic imperialisms (rule over an enormous portion of emerged land). Likewise, we try to show why, at the level of a dialectic of States, the most populated and territorially extensive political societies with the greatest amount of resources will be those with the greatest ability to impose a particular model of international relations and their geopolitical hegemony on a universal scale in this possible confrontation between TTIP, plus TPP, vs. BRICS.
Abstract:
This paper aims to explore the ritual dimension of the Pyramid Texts, humanity's oldest extensive corpus of religious literature. The varied nature of its textual components has prevented Egyptologists from fully understanding the complexities of the collection and the original contexts in which these texts (rites) appeared. The application of ritual theory, mainly the ritual-syntax approach, offers researchers an excellent framework for analyzing and interpreting the corpus, its structure and its function. Subject to the rules of ritual syntax, it is possible to expose the multiple levels of meaning in the corpus concerning the resurrection and salvation of the deceased.
Abstract:
The aim of this work is to examine the phenomena found in the translations of The Garlic Ballads, a Chinese original translated first into English and then from English into Spanish. We have two specific objectives: on the one hand, to study the translation strategies for the (very frequent) cultural elements detected in The Garlic Ballads; on the other, to examine the indirect translation from English into Spanish through a comparison of the Chinese original, the English version and the Spanish one. The conclusions of this work allow us to study the solutions adopted for the cultural elements, as well as to make a general evaluation of second-hand translation.
Abstract:
The year 1977 saw the making of the first Latino superhero by a Latino artist. From the 1980s onwards it is also possible to find Latina super-heroines, whose number and complexity have kept increasing ever since. Yet, representations of spandexed Latinas are still few. For that reason, the goal of this paper is, firstly, to gather a great number of Latina super-heroines and, secondly, to analyze the role that they have played in the history of American literature and art. More specifically, it aims at comparing the spandexed Latinas created by non-Latino/a artists and mainstream comic enterprises with the Latina super-heroines devised by Latino/a artists. The conclusion is that whereas the former tend to conceive heroines within the constraints of the logic of Girl Power, the latter choose to imbue their works with a more daring political content and to align their heroines with the ideologies of Feminism and Postcolonialism.
Abstract:
Adjoint methods have proven to be an efficient way of calculating the gradient of an objective function with respect to a shape parameter for optimisation, with a computational cost nearly independent of the number of design variables [1]. The approach in this paper links the adjoint surface sensitivities (gradient of the objective function with respect to the surface movement) with the parametric design velocities (movement of the surface due to a CAD parameter perturbation) in order to compute the gradient of the objective function with respect to CAD variables.
For a successful implementation of shape optimization strategies in practical industrial cases, the choice of design variables or the parameterisation scheme used for the model to be optimized plays a vital role. Where the goal is to base the optimization on a CAD model, the choices are to use a NURBS geometry generated from CAD modelling software, where the positions of the NURBS control points are the optimisation variables [2], or to use the feature-based CAD model with all of its construction history to preserve the design intent [3]. The main advantage of using the feature-based model is that the optimized model produced can be directly used for downstream applications, including manufacturing and process planning.
This paper presents an approach for optimization based on the feature based CAD model, which uses CAD parameters defining the features in the model geometry as the design variables. In order to capture the CAD surface movement with respect to the change in design variable, the “Parametric Design Velocity” is calculated, which is defined as the movement of the CAD model boundary in the normal direction due to a change in the parameter value.
The approach presented here for calculating the design velocities represents an advancement in capability and robustness over that described by Robinson et al. [3]. The process can be easily integrated into most industrial optimisation workflows and is immune to the topology and labelling issues highlighted by other CAD-based optimisation processes. It considers every continuous (“real value”) parameter type as an optimisation variable, and it can be adapted to work with any CAD modelling software, as long as it has an API which provides access to the values of the parameters which control the model shape and allows the model geometry to be exported. To calculate the movement of the boundary, the methodology employs finite differences on the shape of the 3D CAD models before and after the parameter perturbation. The implementation procedure involves calculating the geometrical movement along a normal direction between two discrete representations of the original and perturbed geometry, respectively. Parametric design velocities can then be directly linked with adjoint surface sensitivities to extract the gradients used in a gradient-based optimization algorithm.
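The finite-difference design velocity and its link to adjoint sensitivities can be sketched as below. The toy example perturbs a sphere's radius parameter, for which the normal boundary movement per unit parameter change is exactly 1; matching point discretisations are assumed for simplicity, whereas a real implementation must project between differing tessellations.

```python
import numpy as np

def design_velocity(points, perturbed_points, normals, dp):
    """Finite-difference parametric design velocity: the normal component
    of the boundary movement per unit change of the CAD parameter."""
    return np.einsum('ij,ij->i', perturbed_points - points, normals) / dp

def gradient(adjoint_sensitivity, velocity, areas):
    """dJ/dp as a discrete surface integral of sensitivity x velocity."""
    return float(np.sum(adjoint_sensitivity * velocity * areas))

# Toy case: points sampled on a unit sphere; the design parameter is the
# radius, perturbed by dp, so the design velocity should be exactly 1.
rng = np.random.default_rng(4)
n = rng.normal(size=(1000, 3))
n /= np.linalg.norm(n, axis=1, keepdims=True)  # points double as unit normals
dp = 1e-3
p0, p1 = n, (1.0 + dp) * n
v = design_velocity(p0, p1, n, dp)

# With J = enclosed volume, the adjoint sensitivity is 1 on the boundary, so
# dJ/dr should recover the surface area 4*pi of the unit sphere.
dJdp = gradient(np.ones(1000), v, np.full(1000, 4.0 * np.pi / 1000.0))
print(round(float(v.mean()), 6), round(dJdp / np.pi, 6))  # → 1.0 4.0
```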
A flow optimisation problem is presented in which the power dissipation of the flow in an automotive air duct is to be reduced by changing the parameters of the CAD geometry created in CATIA V5. The flow sensitivities are computed with the continuous adjoint method for laminar and turbulent flow [4] and are combined with the parametric design velocities to compute the cost function gradients. A line-search algorithm is then used to update the design variables and proceed further with the optimisation process.
Abstract:
BACKGROUND: Calcium channel blockers (CCBs) may affect prostate cancer (PCa) growth by various mechanisms including those related to androgens. The fusion of the androgen-regulated gene TMPRSS2 and the oncogene ERG (TMPRSS2:ERG or T2E) is common in PCa, and prostate tumors that harbor the gene fusion are believed to represent a distinct disease subtype. We studied the association of CCB use with the risk of PCa, and molecular subtypes of PCa defined by T2E status.
METHODS: Participants were residents of King County, Washington, recruited for population-based case-control studies (1993-1996 or 2002-2005). Tumor T2E status was determined by fluorescence in situ hybridization using tumor tissue specimens from radical prostatectomy. Detailed information on use of CCBs and other variables was obtained through in-person interviews. Binomial and polytomous logistic regression were used to generate odds ratios (ORs) and 95% confidence intervals (CIs).
RESULTS: The study included 1,747 PCa patients and 1,635 age-matched controls. A subset of 563 patients treated with radical prostatectomy had T2E status determined, of which 295 were T2E positive (52%). Use of CCBs (ever vs. never) was not associated with overall PCa risk. However, among European-American men, users had a reduced risk of higher-grade PCa (Gleason scores ≥7: adjusted OR = 0.64; 95% CI: 0.44-0.95). Further, use of CCBs was associated with a reduced risk of T2E positive PCa (adjusted OR = 0.38; 95% CI: 0.19-0.78), but was not associated with T2E negative PCa.
CONCLUSIONS: This study found suggestive evidence that use of CCBs is associated with reduced relative risks for higher Gleason score and T2E positive PCa. Future studies of PCa etiology should consider etiologic heterogeneity as PCa subtypes may develop through different causal pathways.