801 results for Computational Thinking
Abstract:
Dynamic core-shell nanoparticles have received increasing attention in recent years. This paper presents a detailed study of Au-Hg nanoalloys, whose constituent elements show a large difference in cohesive energy. A simple method to prepare Au@Hg particles with precise control over the composition up to 15 atom% mercury is introduced, based on reacting a citrate-stabilized gold sol with elemental mercury. Transmission electron microscopy shows an increase of particle size with increasing mercury content and, together with X-ray powder diffraction, points towards the presence of a core-shell structure with a gold core surrounded by an Au-Hg solid solution layer. The amalgamation process is described by pseudo-zero-order reaction kinetics, which indicates slow dissolution of mercury in water as the rate-determining step, followed by fast scavenging by nanoparticles in solution. Once adsorbed at the surface, slow diffusion of Hg into the particle lattice occurs, to a depth of ca. 3 nm, independent of Hg concentration. Discrete dipole approximation calculations relate the UV-vis spectra to the microscopic details of the nanoalloy structure. Segregation energies and metal distribution in the nanoalloys were modeled by density functional theory calculations. The results indicate slow metal interdiffusion at the nanoscale, which has important implications for synthetic methods aimed at core-shell particles.
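The pseudo-zero-order kinetics described above imply that Hg uptake by the particles is linear in time (limited by the constant rate of mercury dissolution) until the supply is exhausted. A minimal sketch, with a hypothetical rate constant not taken from the paper:

```python
# Sketch of pseudo-zero-order amalgamation kinetics. The rate constant k0
# is an assumed, illustrative value; only the 15 atom% ceiling comes from
# the abstract.
import numpy as np

k0 = 0.5      # zero-order uptake rate, atom% Hg per hour (assumption)
x_max = 15.0  # maximum Hg content reached, atom% (from the abstract)

def hg_content(t):
    """Hg uptake grows linearly in time, capped by the available mercury."""
    return np.minimum(k0 * t, x_max)

print(hg_content(np.array([0.0, 10.0, 30.0, 60.0])))  # [ 0.  5. 15. 15.]
```

The zero-order (concentration-independent) form is what distinguishes a supply-limited process, here slow Hg dissolution, from kinetics controlled by the nanoparticles themselves.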
Abstract:
Background: The reduction in the amount of food available to European avian scavengers as a consequence of restrictive public health policies is a concern for managers and conservationists. Since 2002, the application of several sanitary regulations has limited the availability of feeding resources provided by domestic carcasses, but theoretical studies assessing whether the food resources provided by wild ungulates are enough to cover energetic requirements are lacking. Methodology/Findings: We assessed the food provided by a wild ungulate population in two areas of NE Spain inhabited by three vulture species and developed a P System computational model to assess the effects of the available carrion resources on their population dynamics. We compared the real population trend with a hypothetical scenario in which only food provided by wild ungulates was available. Simulation testing of the model suggests that wild ungulates constitute an important food resource in the Pyrenees, and that the vulture population inhabiting this area could grow even if only the food provided by wild ungulates were available. In contrast, in the Pre-Pyrenees there is insufficient food to cover the energy requirements of the avian scavenger guild, which would decline sharply if biomass from domestic animals were not available. Conclusions/Significance: Our results suggest that public health legislation can modify scavenger population trends if a large number of domestic ungulate carcasses disappear from the mountains. In that case, the food provided by wild ungulates might not be enough, and supplementary feeding could be necessary if other alternative food resources are not available (e.g. the reintroduction of wild ungulates), preferably in European Mediterranean scenarios sharing similar socio-economic conditions where there are low densities of wild ungulates.
Managers should anticipate the conservation actions required by assessing food availability and the possible scenarios in order to make the most suitable decisions.
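The core of the scenario comparison, a population that grows until capped by the carrying capacity implied by available carrion biomass, can be sketched with a toy difference equation. This is only an illustration of the logic; the authors used a P System model, and every number below is invented:

```python
# Toy food-limited population model (illustrative only; not the paper's
# P System model, and all parameter values are assumptions).

def simulate(pop, food_kg, req_kg_per_bird, growth=0.05, years=10):
    """Each year the population grows at a fixed rate unless the
    carrying capacity (annual carrion / per-bird requirement) caps it."""
    capacity = food_kg / req_kg_per_bird
    history = [pop]
    for _ in range(years):
        pop = min(pop * (1 + growth), capacity)
        history.append(round(pop, 1))
    return history

# Pyrenees-like scenario: ample wild-ungulate carrion, population can grow.
print(simulate(pop=100, food_kg=50000, req_kg_per_bird=365))
# Pre-Pyrenees-like scenario: food deficit, population drops below its start.
print(simulate(pop=100, food_kg=20000, req_kg_per_bird=365))
```

The two calls mirror the paper's qualitative contrast: the same dynamics yield growth where carrion is plentiful and a sharp decline where the wild-ungulate food supply falls short of requirements.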
Abstract:
We previously reported that the nuclear grade assignment of prostate carcinomas is subject to a cognitive bias induced by the tumor architecture. Here, we asked whether this bias is mediated by the non-conscious selection of nuclei that "match the expectation" induced by the inadvertent glance at the tumor architecture. Twenty pathologists were asked to grade nuclei in high power fields of 20 prostate carcinomas displayed on a computer screen. Unknown to the pathologists, each carcinoma was shown twice, once against the background of a low grade, tubule-rich carcinoma and once against the background of a high grade, solid carcinoma. Eye tracking made it possible to identify which nuclei the pathologists fixated on during the 8-second projection period. For all 20 pathologists, nuclear grade assignment was significantly biased by tumor architecture. Pathologists tended to fixate on bigger, darker, and more irregular nuclei when those were projected against high grade, solid carcinomas than against low grade, tubule-rich carcinomas (and vice versa). However, the morphometric differences of the selected nuclei accounted for only 11% of the architecture-induced bias, suggesting that the bias can be explained only in small part by the unconscious fixation on nuclei that "match the expectation". In conclusion, the selection of "matching nuclei" represents an unconscious effort to vindicate the gravitation of nuclear grades towards the tumor architecture.
Abstract:
Collision-induced dissociation (CID) in tandem mass spectrometry (MS) has been used to determine the identity of peptides and other large biological molecules. MS identifies molecules through their interaction with electromagnetic fields; coupled with another method such as infrared (IR) vibrational spectroscopy it can provide structural information, but on its own it yields only the mass-to-charge (m/z) ratios of the fragments produced, which may not be enough to determine the mechanism of a molecule's collision-induced dissociation. In this case, theoretical calculations are a useful companion to MS data and yield clues about the energetics of the dissociation. In this study, negative ion electrospray tandem MS was used to study the CID of the deprotonated dipeptide glycine-serine (Gly-Ser). Though negative ion MS is less commonly used than positive ion MS, studies by Bowie et al. show that it yields unique clues about molecular structure that complement positive ion spectra, such as characteristic fragmentations like the loss of formaldehyde from the serine residue [2]. Increasing the collision energy in the mass spectrometer alters the flexibility of the dipeptide backbone, enabling both isomerizations (reactions not resulting in a fragment loss) and dissociations to take place. The mechanism of the CID of Gly-Ser was studied using two computational methods, B3LYP/6-311+G* and M06-2X/6-311++G**. The main pathway for molecular dissociation was analyzed in 5 conformers in an attempt to verify the initial mechanism proposed by Dr. James Swan after examination of the MS data. The results suggest that the loss of formaldehyde from serine, which Bowie et al. indicate is characteristic of the presence of serine in a protein residue, is an endothermic reaction made possible by the conversion of the translational energy of the ion into internal energy as the ion collides with the inert collision gas. It was also determined that the M06-2X functional's improved description of medium- and long-range correlation makes it more effective than the B3LYP functional at finding elusive transition states; M06-2X also predicts the energies of those transition states more accurately than B3LYP does. A second CID mechanism was also studied, one that passes through intermediates with the same m/z ratios as the main pathway but different structures, including a diketopiperazine intermediate. This pathway was analyzed with 3 conformers and the M06-2X functional, owing to its previously determined effectiveness. The results suggest that this second pathway, which passes through the same intermediate masses as the first mechanism, is lower in overall energy and therefore a more likely pathway of dissociation.
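The conversion of translational to internal energy invoked above is bounded by the standard lab-frame to center-of-mass transformation for a single collision. A small sketch of that textbook relation; the ion mass is the approximate mass of deprotonated Gly-Ser, and argon as the collision gas is an assumption (the abstract says only "inert collision gas"):

```python
# Maximum energy available for internal excitation in one ion-gas collision:
# E_com = E_lab * m_gas / (m_gas + m_ion). Masses below are approximate and
# argon is an assumed collision gas, not stated in the abstract.

def com_energy(e_lab_eV, ion_mass, gas_mass):
    """Center-of-mass collision energy (eV) from the lab-frame energy."""
    return e_lab_eV * gas_mass / (gas_mass + ion_mass)

m_ion = 161.0  # [Gly-Ser - H]-, approximate mass in Da
m_gas = 40.0   # argon (assumed)
print(round(com_energy(20.0, m_ion, m_gas), 2))  # 3.98
```

The heavy ion and light gas mean only a fraction of the lab-frame energy can become internal energy, which is why raising the collision energy is needed to drive endothermic channels such as the formaldehyde loss.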
Abstract:
Cold-formed steel (CFS) framing combined with wood sheathing, such as oriented strand board (OSB), forms shear walls that can provide lateral resistance to seismic forces. The ability to accurately predict building deformations in damaged states under seismic excitation is essential for modern performance-based seismic design. However, few static or dynamic tests have been conducted on the non-linear behavior of CFS shear walls. The purpose of this research is therefore to provide and demonstrate a fastener-based computational model of CFS shear walls that incorporates the essential nonlinearities, which may eventually lead to improvement of the current seismic design requirements. The approach is based on the understanding that the complex interaction of the fasteners with the sheathing is an important factor in the non-linear behavior of the shear wall. The computational model consists of beam-column elements for the CFS framing and a rigid diaphragm for the sheathing. The framing and sheathing are connected with non-linear zero-length fastener elements to capture the OSB sheathing damage surrounding the fastener area. Using computational programs such as OpenSees and MATLAB, 4 ft. x 9 ft., 8 ft. x 9 ft. and 12 ft. x 9 ft. shear wall models are created, and monotonic lateral forces are applied to the computer models. The output data are then compared with the available results of physical testing. The results indicate that the OpenSees model can accurately capture the initial stiffness, strength and non-linear behavior of the shear walls.
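The nonlinear zero-length fastener elements described above are defined by a force-slip backbone that rises elastically, reaches a capacity, and then softens as the sheathing around the fastener is damaged. A hedged sketch of such a backbone; the study used OpenSees material models, and every number here is illustrative, not from the tests:

```python
# Simplified trilinear fastener backbone (elastic rise, plateau, post-peak
# softening). This is an illustration of the concept, not the OpenSees
# material law used in the study; all parameter values are assumptions.

def fastener_force(d, k0=1.2, d_peak=3.0, f_peak=1.8, d_ult=12.0):
    """Force (kN) carried by one sheathing fastener at slip d (mm)."""
    yield_slip = f_peak / k0
    if d <= yield_slip:
        return k0 * d                                    # elastic branch
    if d <= d_peak:
        return f_peak                                    # capacity plateau
    if d <= d_ult:
        return f_peak * (d_ult - d) / (d_ult - d_peak)   # softening branch
    return 0.0                                           # fastener failed

for slip in (0.5, 3.0, 7.5, 15.0):
    print(slip, round(fastener_force(slip), 2))
```

Summing the shear contributions of many such elements around the wall perimeter is what lets a fastener-based model reproduce the initial stiffness, peak strength, and softening of the full shear wall.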
Computational Fluid Dynamics and Its Impact on Flow Measurements Using Phase-Contrast MR-Angiography
Abstract:
The group analysed some syntactic and phonological phenomena that presuppose the existence of interrelated components within the lexicon, which motivate the assumption that there are some sublexicons within the global lexicon of a speaker. This result is confirmed by experimental findings in neurolinguistics. Hungarian-speaking agrammatic aphasics were tested in several ways, the results showing that the sublexicon of closed-class lexical items provides a highly automated complex device for processing surface sentence structure. Analysing Hungarian ellipsis data from a semantic-syntactic aspect, the group established that the lexicon is best conceived of as being split into at least two main sublexicons: the store of semantic-syntactic feature bundles and a separate store of sound forms. On this basis they proposed a format for representing open-class lexical items whose meanings are connected via certain semantic relations. They also proposed a new classification of verbs to account for the contribution of the aspectual reading of the sentence depending on the referential type of the argument, and a new account of the syntactic and semantic behaviour of aspectual prefixes. The partitioned sets of lexical items are sublexicons on phonological grounds. These sublexicons differ in terms of phonotactic grammaticality. The degrees of phonotactic grammaticality are tied up with the problem of psychological reality: how many of these degrees native speakers are actually sensitive to. The group developed a hierarchical construction network as an extension of the original General Inheritance Network formalism, and this framework was then used as a platform for the implementation of the grammar fragments.
Abstract:
Post-Soviet countries are in the process of transformation from a totalitarian order to a democratic one, a transformation which is impossible without a profound shift in people's way of thinking. The group set themselves the task of determining the essence of this shift. Using a multidisciplinary approach, they looked at concrete ways of overcoming the totalitarian mentality and forming the mentality necessary for an open democratic society. They studied the contemporary conceptions of tolerance and critical thinking and looked for new foundations of criticism, especially in hermeneutics. They then sought to substantiate the complementary relation between tolerance and criticism in the democratic way of thinking and to prepare a syllabus for teaching on the subject in Ukrainian higher education. In a philosophical exploration of tolerance they began with religious tolerance as its first and most important form. Political and social interests often lay at the foundations of religious intolerance, and this implicitly comprised the transition to religious tolerance when conditions changed. Early polytheism was more or less indifferent to dogmatic deviations, but monotheism is intolerant of heresies. The damage wrought by the religious wars of the Reformation transformed tolerance into a value. They did not create religious tolerance but forced its recognition as a positive phenomenon. With the weakening of religious institutions in the modern era, the purely political nature of many conflicts became evident, and this stimulated the extrapolation of tolerance into secular life. Each historical era has certain acts and operations which may be interpreted as tolerant, and these can be classified as to whether or not they are based on the conscious following of the principle of tolerance. This criterion requires the separation of the phenomenon of tolerance from its concept and from tolerance as a value.
Only the conjunction of a concept of tolerance with a recognition of its value can transform it into a principle dictating a norm of conscious behaviour. The analysis of the contemporary conception of tolerance focused on the diversity of the concept and concluded that the notions used cannot be combined in the framework of a single more or less simple classification, as the distinctions between them are stimulated by the complexity of the reality considered and the variety of its manifestations. Notions considered in relation to tolerance included pluralism, respect and particular-universal. The rationale of tolerance was also investigated, and the group felt that any substantiation of the principle of tolerance must take into account human beings' desire for knowledge. Before respecting or being tolerant of another person different from myself, I should first know where the difference lies, so knowledge is a necessary condition of tolerance. The traditional division of truth into scientific (objective and unique) and religious, moral, political (subjective and so multiple) intensifies the problem of the relationship between truth and tolerance. Science was long seen as a field of "natural" intolerance whereas the validity of tolerance was accepted in other intellectual fields. As tolerance emerges when there is difference and opposition, it is essentially linked with rivalry, and there is a growing recognition today that unlimited rivalry is neither able to direct the process of development nor to act as creative matter. Social and economic reality has led to rivalry being regulated by the state, and a natural requirement of this is to associate tolerance with a special "purified" form of rivalry, an acceptance of the activity of different subjects and a specification of the norms of their competition.
Tolerance and rivalry should therefore be subordinate to a degree of discipline, and the group point out that discipline, including self-discipline, is a regulator of the balance between them. Two problematic aspects of tolerance were identified: why something traditionally supposed to have no positive content has become a human activity today, and whether tolerance has full-scale cultural significance. The resolution of these questions requires a revision of the phenomenon and conception of tolerance to clarify its immanent positive content. This involved an investigation of the contemporary concept of tolerance and of the epistemological foundations of a negative solution of tolerance in Greek thought. An original solution to the problem of the extrapolation of tolerance to scientific knowledge was proposed, based on the Duhem-Quine theses and the conception of background knowledge. In this way tolerance as a principle of mutual relations between different scientific positions gains an essential epistemological rationale and so an important argument for its own universal status. The group then went on to consider the ontological foundations for a positive solution of this problem, beginning with the work of Poincaré and Reichenbach. The next aspect considered was the conceptual foundations of critical thinking, looking at the ideas of Karl Popper and St. Augustine and at the problem of the demarcation line between reasonable criticism and apologetic reasoning. Dogmatic and critical thinking in a political context were also considered, before an investigation of critical thinking's foundations. As logic is essential to critical thinking, the state of this discipline in Ukrainian and Russian higher education was assessed, together with the limits of formal-logical grounds for criticism, the role of informal logic as a basis for critical thinking today, dialectical logic as a foundation for critical thinking and the universality of the contemporary demand for criticism.
The search for new foundations of critical thinking covered deconstructivism and critical hermeneutics, including the problem of the author. The relationship between tolerance and criticism was traced from the ancient world, both eastern and Greek, through the transitional community of the Renaissance to the industrial community (Locke and Mill) and the evolution of this relationship today, when these are viewed not as moral virtues but as ordinary norms. Tolerance and criticism were discussed as complementary manifestations of human freedom. If the completeness of freedom were accepted, it would be impossible to avoid recognition of the natural and legal nature of these manifestations, and the group argue that critical tolerance is able to avoid dismissing such negative phenomena as the degradation of taste and manner, pornography, etc. On the basis of their work, the group drew up the syllabus of a course in "Logic with Elements of Critical Thinking", and of a special course on the "Problem of Tolerance".
Abstract:
OBJECTIVE: To describe and evaluate psychosocial factors in nonorganic voice disorders (NVDs). Nonorganic voice disorders are presumed to be the result of increased muscular tension that is caused to varying extents by vocal misuse and emotional stress. It is therefore necessary to include both of these in the diagnosis and treatment of patients with voice disorders. DESIGN: Clinical survey. SETTING: Academic tertiary referral center. PATIENTS: To evaluate psychosocial factors in NVDs, a sample of 74 patients with NVDs was examined psychologically using the Giessen Test and Picture Frustration Test. The results were compared with a control group of 19 patients with an organic dysphonia (vocal cord paralysis). MAIN OUTCOME MEASURES: Six scales of the Giessen Test (social response, dominance, control, underlying mood, permeability, and social potency), 3 reaction types of the Picture Frustration Test (obstacle dominance, ego defense, and need persistence), and 3 aggression categories of the Picture Frustration Test (extrapunitivity, intropunitivity, and impunitivity). RESULTS: The most striking significant difference between the 2 groups was that in conflict situations, patients with NVDs sought a quick solution or expected other people to provide one, which prevented them from understanding the underlying causes of the conflict. CONCLUSIONS: Only if the psychosocial aspects are taken into account can patients with NVD be offered a therapy that treats the causes of the voice disorder. It must be decided individually whether and when a voice training approach or a more psychological-psychotherapeutical approach is preferable.
Abstract:
This paper will explore re-framing historic atrocity and its relationship to Holocaust and Genocide education. The origins of genocide studies and its links to Holocaust studies will be traced to discuss the impact of new scholarship and framings on genocide education in the classroom.
Abstract:
The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. We detail some of the design decisions, software paradigms and operational strategies that have allowed a small number of researchers to provide a wide variety of innovative, extensible software solutions in a relatively short time. The use of an object-oriented programming paradigm, the adoption and development of a software package system, design by contract, distributed development and collaboration with other projects are elements of this project's success. Individually, each of these concepts is useful and important, but when combined they have provided a strong basis for rapid development and deployment of innovative and flexible research software for scientific computation. A primary objective of this initiative is the achievement of total remote reproducibility of novel algorithmic research results.
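Design by contract, one of the paradigms credited above, means a function publicly states preconditions its callers must satisfy and postconditions it guarantees in return. A minimal sketch of the idea (Bioconductor itself is written in R; this Python fragment and its function name are purely illustrative):

```python
# Illustrative design-by-contract sketch: explicit pre- and postconditions
# checked with assertions. Not Bioconductor code; the function is invented
# to demonstrate the paradigm only.

def normalize(values):
    """Precondition: non-empty sequence of non-negative numbers, not all zero.
    Postcondition: returned proportions sum to 1.0 (within float tolerance)."""
    assert values and all(v >= 0 for v in values), "precondition violated"
    total = sum(values)
    assert total > 0, "precondition violated: all values are zero"
    result = [v / total for v in values]
    assert abs(sum(result) - 1.0) < 1e-9, "postcondition violated"
    return result

print(normalize([1, 1, 2]))  # [0.25, 0.25, 0.5]
```

Stated contracts let independently developed packages interoperate reliably, which is the property that matters for a distributed, multi-author project of this kind.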