933 results for Incommensurability of values
Abstract:
The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underscoring the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially depending on the approach to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which then translates into inconsistent results on the potential higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches considered.
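The instability of the between-technique risk ratio described above can be illustrated with a toy calculation. All numbers and model names below are hypothetical and are not taken from the study:

```python
# Toy illustration with made-up numbers: absolute secondary-cancer risks (%)
# for two generic techniques under three hypothetical risk models.
risks = {
    "technique_A": {"model_linear_1": 1.8, "model_linear_2": 2.4, "model_nonlinear": 0.6},
    "technique_B": {"model_linear_1": 1.2, "model_linear_2": 2.9, "model_nonlinear": 0.4},
}

# Risk ratio B/A under each model.
ratios = {model: risks["technique_B"][model] / risks["technique_A"][model]
          for model in risks["technique_A"]}

# If the ratios straddle 1, the ranking of the two techniques depends on
# which risk model is chosen: the inconsistency described in the abstract.
straddles_one = min(ratios.values()) < 1.0 < max(ratios.values())
```

With these illustrative figures, two models favor technique B and one favors technique A, so the qualitative conclusion flips with the choice of model.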
Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy) as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
Abstract:
The Jalta and Jebel Ghozlane ore deposits are located in the extreme north of Tunisia, within the Nappe zone. The mineralization of Jalta, hosted in Triassic dolostones and the overlying Mio-Pliocene conglomerates, consists of abundant galena, barite, and cerussite with accessory sphalerite, pyrite, and jordanite. At Jebel Ghozlane, large Pb-Zn concentrations occur in the Triassic dolostones and Eocene limestones. The mineral association consists of galena, sphalerite, barite, and celestite and their oxidation products (cerussite, smithsonite, and anglesite). Lead isotope ratios in galena from both districts are relatively homogeneous (²⁰⁶Pb/²⁰⁴Pb = 18.702-18.823, ²⁰⁷Pb/²⁰⁴Pb = 15.665-15.677, ²⁰⁸Pb/²⁰⁴Pb = 38.725-38.875). The δ³⁴S values for sulfates from both areas (+12.2 to +16.2‰ at Jalta and +14.3 to +19.4‰ at Jebel Ghozlane) are compatible with a derivation of sulfur from marine sulfates, possibly sourced from the Triassic evaporites. The δ³⁴S values of the sulfides range from -10 to +12.5‰ at Jalta and from -9.1 to +22.1‰ at Jebel Ghozlane. This large range of values suggests bacterial and/or thermochemical reduction of the sulfate. The high δ³⁴S values of sulfides require closed-system reduction processes. The isotopically light carbon in late calcites (-6.3 to -2.5‰) and authigenic dolomite (-17.6‰) suggests an organic source for at least some of the carbon in these samples, whereas the similarity of the δ¹⁸O values between calcite (+24.8‰) and the authigenic dolomite (+24.7‰) of Jalta and their respective host rocks reflects oxygen isotope buffering of the mineralizing fluids by the host rock carbonates.
The secondary calcite isotope compositions of Jalta are compatible with a hydrothermal fluid circulation at approximately 100 to 200 °C, but temperatures as low as 50 °C may be indicated by the late calcite of Jebel Ghozlane (δ¹⁸O of +35.9‰). Given the geological events related to the Alpine orogeny in the Nappe zone (nappe emplacement, bimodal volcanism, and reactivation of major faults, such as Ghardimaou-Cap Serrat) and the Neogene age of the host rocks in several localities, a Late Miocene age is proposed for the Pb-Zn ore deposits considered in this study. Remobilization of deep-seated primary deposits in the Paleozoic sequence is the most probable source for metals in both localities considered in this study and probably in the Nappe zone as a whole. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
We use interplanetary transport simulations to compute a database of electron Green's functions, i.e., differential intensities resulting at the spacecraft position from an impulsive injection of energetic (>20 keV) electrons close to the Sun, for a large number of values of two standard interplanetary transport parameters: the scattering mean free path and the solar wind speed. The nominal energy channels of the ACE, STEREO, and Wind spacecraft have been used in the interplanetary transport simulations to create a single tool for the study of near-relativistic electron events observed at 1 AU. In this paper, we quantify the characteristic times of the Green's functions (onset and peak time, rise and decay phase duration) as a function of the interplanetary transport conditions. We use the database to calculate the FWHM of the pitch-angle distributions at different times of the event and under different scattering conditions. This allows us to provide a first quantitative result that can be compared with observations, and to assess the validity of the frequently used term "beam-like pitch-angle distribution".
Abstract:
We analyze the neutron skin thickness in finite nuclei with the droplet model and effective nuclear interactions. The ratio of the bulk symmetry energy J to the so-called surface stiffness coefficient Q has in the droplet model a prominent role in driving the size of neutron skins. We present a correlation between the density derivative of the nuclear symmetry energy at saturation and the J/Q ratio. We emphasize the role of the surface widths of the neutron and proton density profiles in the calculation of the neutron skin thickness when one uses realistic mean-field effective interactions. Next, taking as experimental baseline the neutron skin sizes measured in 26 antiprotonic atoms along the mass table, we explore constraints arising from neutron skins on the value of the J/Q ratio. The results favor a relatively soft symmetry energy at subsaturation densities. Our predictions are compared with the recent constraints derived from other experimental observables. Though the various extractions predict different ranges of values, one finds a narrow window L∼45-75 MeV for the coefficient L that characterizes the density derivative of the symmetry energy that is compatible with all the different empirical indications.
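The slope coefficient L quoted above is conventionally defined from the density dependence of the symmetry energy S(ρ) at the saturation density ρ₀. The abstract does not display the formula; the standard definition in the literature is:

```latex
% Conventional definition of the symmetry-energy slope parameter L,
% evaluated at the nuclear saturation density \rho_0
L = 3\rho_0 \left.\frac{\partial S(\rho)}{\partial \rho}\right|_{\rho=\rho_0}
```

This is the quantity for which the abstract reports the compatible window L ∼ 45-75 MeV.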
Abstract:
Nuclear DNA content in gametophytes and sporophytes or the prostrate phases of the following species of Bonnemaisoniaceae (Asparagopsis armata, Asparagopsis taxiformis, Bonnemaisonia asparagoides, Bonnemaisonia clavata and Bonnemaisonia hamifera) was estimated by image analysis and static microspectrophotometry using the DNA-localizing fluorochrome DAPI (4′,6-diamidino-2-phenylindole, dilactate) and the chicken erythrocyte standard. These estimates expand on the Kew database of nuclear DNA content. DNA content values for 1C nuclei in the gametophytes (spermatia and vegetative cells) range from 0.5 pg to 0.8 pg, and for 2C nuclei in the sporophytes or the prostrate phases from 1.15 pg to 1.7 pg. Although only the 2C and 4C values were observed in the sporophyte or the prostrate phase, in the vegetative cells of the gametophyte the values oscillated from 1C to 4C, suggesting a possible onset of endopolyploidy. The results confirm the alternation of nuclear phases in these Bonnemaisoniaceae species, both in those that have tetrasporogenesis and in those that have somatic meiosis. The availability of a consensus phylogenetic tree for Bonnemaisoniaceae has opened the way to determining evolutionary trends in DNA contents. Both the estimated genome sizes and the published chromosome numbers for Bonnemaisoniaceae suggest a narrow range of values consistent with the conservation of an ancestral genome.
Abstract:
[eng] We propose two generalizations of the Banzhaf value for partition function form games. In both cases, our approach is based on probability distributions over the set of possible coalition structures that may arise for any given set of agents. First, we introduce a family of values, one for each collection of the latter probability distributions, defined as the Banzhaf value of an expected coalitional game. Then, we provide two characterization results for this new family of values within the framework of all partition function games. Both results rely on a property of neutrality with respect to amalgamation of players. Second, as this collusion transformation fails to be meaningful for simple games in partition function form, we propose another generalization of the Banzhaf value which also builds on probability distributions of the above type. This latter family is characterized by means of a neutrality property which uses an amalgamation transformation of players for which simple games are closed.
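As background for the generalizations described above, the ordinary Banzhaf value of a characteristic-function game averages each player's marginal contribution over all coalitions of the other players. A minimal sketch for a weighted majority game (the weights and quota are illustrative; the partition-function extensions in the paper are not implemented here):

```python
from itertools import combinations

def banzhaf(players, v):
    """Raw Banzhaf value: average marginal contribution of each player
    over all 2^(n-1) coalitions of the remaining players."""
    values = {}
    n = len(players)
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for r in range(len(others) + 1):
            for coal in combinations(others, r):
                total += v(set(coal) | {p}) - v(set(coal))
        values[p] = total / 2 ** (n - 1)
    return values

# Illustrative weighted majority game: weights 3, 2, 1 and quota 4;
# a coalition S wins (v = 1) when its total weight reaches the quota.
weights = {"a": 3, "b": 2, "c": 1}
v = lambda S: 1.0 if sum(weights[p] for p in S) >= 4 else 0.0
bz = banzhaf(list(weights), v)
```

Player "a" is pivotal in three of the four coalitions of the others, so its raw Banzhaf value is 0.75, against 0.25 for each of "b" and "c".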
Abstract:
Short-term synaptic depression (STD) is a form of synaptic plasticity that has a large impact on network computations. Experimental results suggest that STD is modulated by cortical activity, decreasing with activity in the network and increasing during silent states. Here, we explored different activity-modulation protocols in a biophysical network model; the model displayed less STD when the network was active than when it was silent, in agreement with experimental results. Furthermore, we studied how trains of synaptic potentials decayed less during periods of activity (UP states) than during silent periods (DOWN states), providing new experimental predictions. We next tackled the inverse question of how modifying STD parameters affects the emergent activity of the network, a question that is difficult to answer experimentally. We found that synaptic depression of cortical connections played a critical role in determining the regime of rhythmic cortical activity. While low STD resulted in emergent rhythmic activity with short UP states and long DOWN states, increasing STD resulted in longer and more frequent UP states interleaved with short silent periods. Still higher synaptic depression set the network into a non-oscillatory firing regime in which DOWN states no longer occurred. The speed of propagation of UP states along the network was not modulated by STD during the oscillatory regime; it remained relatively stable over a range of STD values. Overall, we found that the mutual interactions between synaptic depression and ongoing network activity are critical in determining the mechanisms that shape cortical emergent patterns.
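The depression mechanism sketched above can be illustrated with a standard phenomenological model of STD (a Tsodyks-Markram-style resource variable; this is a generic textbook formulation, not necessarily the model used in the study, and all parameter values are illustrative):

```python
# Hedged sketch of short-term synaptic depression: a synaptic resource
# x in [0, 1] is depleted by a fraction U at each presynaptic spike and
# recovers toward 1 with time constant tau_rec. Dense firing (UP-state-like
# trains) therefore leaves the synapse more depressed than sparse firing.
def simulate_std(spike_times, t_end=1.0, dt=1e-4, U=0.5, tau_rec=0.2):
    x, t = 1.0, 0.0
    spikes = sorted(spike_times)
    i = 0
    efficacies = []                    # U * x at each spike: released fraction
    while t < t_end:
        x += dt * (1.0 - x) / tau_rec  # exponential recovery toward 1
        if i < len(spikes) and t >= spikes[i]:
            efficacies.append(U * x)
            x -= U * x                 # depletion at the spike
            i += 1
        t += dt
    return efficacies

dense = simulate_std([0.1 * k for k in range(1, 9)])   # regular 10 Hz train
sparse = simulate_std([0.1, 0.7])                      # two well-spaced spikes
```

In this sketch the later spikes of the dense train release progressively less resource than the first, while the widely spaced spikes recover almost fully between releases, mirroring the activity-dependence of STD described in the abstract.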
Abstract:
PROBLEM: Truth-telling is an important component of respect for patients' self-determination, but in the context of breaking bad news, it is also a distressing and difficult task. INTERVENTION: We investigated the long-term influence of a simulated patient-based teaching intervention, integrating learning objectives in communication skills and ethics, on students' attitudes and concerns regarding truth-telling. We followed two cohorts of medical students from the preclinical third year to their clinical rotations (fifth year). Open-ended responses were analysed to explore medical students' reported difficulties in breaking bad news. CONTEXT: This intervention was implemented during the last preclinical year of a problem-based medical curriculum, in collaboration between the doctor-patient communication and ethics programs. OUTCOME: Over time, concerns such as empathy and truthfulness shifted from a personal to a relational focus. Whereas 'truthfulness' was a concern about the content of the message, 'truth-telling' included concerns about how information was communicated and how realistically it was received. Truth-telling required empathy, adaptation to the patient, and appropriate management of emotions, both for the patient's welfare and for a realistic understanding of the situation. LESSONS LEARNED: Our study confirms that an intervention confronting students with a realistic situation succeeds in making them more aware of the real issues of truth-telling. Medical students deepened their reflection over time, acquiring a deeper understanding of the relational dimension of values such as truth-telling, and honing their view of empathy.
Abstract:
A thorough literature review about the current situation on the implementation of eye lens monitoring has been performed in order to provide recommendations regarding dosemeter types, calibration procedures and practical aspects of eye lens monitoring for interventional radiology personnel. Most relevant data and recommendations from about 100 papers have been analysed and classified in the following topics: challenges of today in eye lens monitoring; conversion coefficients, phantoms and calibration procedures for eye lens dose evaluation; correction factors and dosemeters for eye lens dose measurements; dosemeter position and influence of protective devices. The major findings of the review can be summarised as follows: the recommended operational quantity for eye lens monitoring is Hp(3). At present, several dosemeters are available for eye lens monitoring and calibration procedures are being developed. However, in practice, very often, alternative methods are used to assess the dose to the eye lens. A summary of correction factors found in the literature for the assessment of the eye lens dose is provided. These factors can give an estimation of the eye lens dose when alternative methods, such as the use of a whole body dosemeter, are used. A wide range of values is found, thus indicating the large uncertainty associated with these simplified methods. Reduction factors from most common protective devices obtained experimentally and using Monte Carlo calculations are presented. The paper concludes that the use of a dosemeter placed at collar level outside the lead apron can provide a useful first estimate of the eye lens exposure. However, for workplaces with estimated annual equivalent dose to the eye lens close to the dose limit, specific eye lens monitoring should be performed. Finally, training of the involved medical staff on the risks of ionising radiation for the eye lens and on the correct use of protective systems is strongly recommended.
Abstract:
The use of the Bayes factor (BF) or likelihood ratio as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, the progress towards more widespread consensus about foundational principles is still fragile, as it raises new problems about which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e. a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in a given case at hand, forensic scientists ought to offer to a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.
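The point that a BF is a single number, not a distribution, can be made concrete with a minimal sketch for one observation under two fully specified hypotheses (the Gaussian likelihoods and parameter values below are illustrative assumptions, not taken from the article):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bayes_factor(x, h1=(0.0, 1.0), h2=(2.0, 1.0)):
    """BF = p(evidence | H1) / p(evidence | H2).
    For fully specified hypotheses this is one number for the case at hand,
    not a quantity that carries its own probability distribution."""
    return normal_pdf(x, *h1) / normal_pdf(x, *h2)

bf = bayes_factor(0.5)
```

With these illustrative parameters, an observation of 0.5 yields a BF of e ≈ 2.72 in favour of the first hypothesis: a single reported value, as the article argues it should be.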
Abstract:
The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises an extended character of representation. The human mind is not a passive receiver of external information, but actively constructs intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than what is possible in the Cartesian Subject-Object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which classically understood is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse.
Dialectics is Barth's way to express knowledge of God avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God. The reaction against epistemological Cartesianism, metaphysics of substance and deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.
Abstract:
The purpose of this thesis is to study organizational core values and their application in practice. With the help of literature, the thesis discusses the implementation of core values and the benefits that companies can gain by doing so successfully. Also, ways in which companies can improve the application of their values to their everyday work are presented. The case company's value implementation is evaluated through a survey research conducted on their employees. The true power of values lies in their application, and therefore, core values should be the basis for all organizational behavior, integrated into everything a company does. Applying values in practice is an ongoing process and companies should continuously work towards creating a more value-based organizational culture. If a company does this effectively, it will most likely become more successful with stakeholders as well as financially. Companies looking to turn their values into actions should start with a self-assessment. Employee surveys are effective in assessing the current level of value implementation, since employees have valuable, first-hand information regarding the situations and behaviors they face in their everyday work. After the self-assessment, things like management commitment, communication, training, and support are key success factors in value implementation.
Abstract:
The present study aimed at evaluating the use of an Artificial Neural Network to correlate the values resulting from chemical analyses of coffee samples with the values of their sensory analyses. The coffee samples used were from Coffea arabica L., cultivars Acaiá do Cerrado, Topázio, Acaiá 474-19 and Bourbon, collected in the southern region of the state of Minas Gerais. The chemical analyses were carried out for reducing and non-reducing sugars. The quality of the beverage was evaluated by sensory analysis. The Artificial Neural Network used the values from the chemical analyses as input variables and the values from the sensory analysis as output values. Multiple linear regression of the sensory analysis values on the chemical analysis values yielded a coefficient of determination of 0.3106, while the Artificial Neural Network achieved an 80.00% success rate in classifying the sensory analysis values.
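The linear baseline the network was compared against can be sketched as follows; the single-predictor fit and the sugar/score numbers are purely illustrative (the study used multiple regression on its own data):

```python
# Illustrative sketch: coefficient of determination (R^2) for a simple
# least-squares fit of sensory scores on one chemical measurement.
# All data values below are hypothetical.
def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                      # least-squares slope
    a = my - b * mx                    # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot         # fraction of variance explained

# Hypothetical reducing-sugar contents (%) vs. sensory scores
sugars = [6.1, 6.8, 7.2, 7.9, 8.4]
scores = [70, 72, 75, 74, 80]
r2 = r_squared(sugars, scores)
```

A low R² on real data, like the 0.3106 reported above, indicates that a linear model captures little of the chemistry-to-sensory relationship, which motivates the non-linear network.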
Abstract:
The purpose of this thesis is to study the factors that affect a company's capability to identify and analyze the value of the digitalization of services during the early stages of the service development process, and to evaluate them from the perspective of a case company. The research problem was defined as: “How does the digitalization of services affect delivering the services of the future?” The research method of this thesis was based on a qualitative case study which aimed to study both the company's and the customer's set of values. The study included a literature review and a development study. The empirical research part consisted of analyzing three existing services, and of specifying a new digital service concept and its feasibility analysis as part of a business requirement phase. To understand the set of values, 10 stakeholder interviews were conducted and earlier customer surveys were utilized; additionally, a number of meetings were held with the case company representatives to develop the service concept and evaluate the findings. The factors in the early stages of the service development process that were found to directly affect the case company's capability to identify and create customer value were related to the themes presented in the literature review. In order to specify the value achieved from digitalization, the following areas of strategic background elements were deepened during the study: innovations, customer understanding, and business service. Based on the findings, the study aims to enhance the case company's capability to identify and evaluate the impact of digitalization on delivering the services of the future. Recognizing the value of a digital service before the beginning of the development project is important to the businesses of both the customer and the provider. By exploring the various levels of digitalization, one can get an overall picture of the value gained from utilizing digital opportunities.
From the development perspective, the process of reviewing and discovering the most promising opportunities and solutions is the key step in delivering superior services. Ultimately, a company should understand how the value outcomes of its individual services, as well as of their digital counterparts, are determined.