989 results for Means-end approach


Relevance: 40.00%

Abstract:

Should computer programming be taught within schools of architecture?

Incorporating even low-level computer programming within architectural education curricula is a matter of debate, but we have found it useful to do so for two reasons: it introduces, or at least consolidates, the realm of descriptive geometry, and it provides an environment for experimenting with morphological, time-based change.

Mathematics and descriptive geometry formed a significant proportion of architectural education until the end of the 19th century. This proportion has declined in contemporary curricula, possibly at some cost, for despite major advances in automated manufacture, Cartesian measurement is still the principal ‘language’ with which to describe a building for construction purposes. When computer programming is used as a platform for instruction in logic and spatial representation, the waning interest in mathematics as a basis for spatial description can be readdressed through a left-field approach. Students gain insights into topology, Cartesian space and morphology through programmatic form finding, as opposed to direct manipulation.

In this context, how the program operates matters to the architect-programmer more than what it does. This paper describes an assignment in which students are given a figurative conceptual space comprising the three Cartesian axes with a cube at its centre. Six Phileban solids mark the Cartesian axial limits of the space. Any point in this space represents a hybrid of one, two or three transformations from the central cube towards the various Phileban solids. Students are asked to predict the topological and morphological outcomes of these operations. Through programming, they become aware of morphogenesis and hybridisation. Here we articulate the hypothesis above and report on the outcomes from a student group, whose work reveals wider learning opportunities for architecture students in computer programming than is conventionally assumed.
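
As a rough illustration of the kind of program such an assignment might involve (a hypothetical sketch, not the assignment's actual code), a point in the conceptual space can drive a vertex-level interpolation from the cube toward the axis-aligned target solids, with each coordinate's magnitude acting as a morph weight:

    import numpy as np

    # Hypothetical sketch: a point p in the conceptual space [-1, 1]^3 morphs
    # sampled cube points toward the target solid at each axial limit;
    # |p[axis]| is the morph weight along that axis.

    def blend(cube_pts, targets, p):
        """Morph sampled cube points toward up to three axis-aligned targets.

        cube_pts : (n, 3) array sampled on the cube's surface
        targets  : dict mapping (axis, sign) to an (n, 3) array sampled on the
                   corresponding target solid, with matching point order
        p        : coordinates in the conceptual space
        """
        out = cube_pts.astype(float).copy()
        for axis in range(3):
            w = abs(p[axis])
            if w > 0:
                target = targets[(axis, 1 if p[axis] > 0 else -1)]
                out += w * (target - cube_pts)  # superpose displacement fields
        return out

    # Demo: cube corners morphing halfway toward a sphere-like target obtained
    # by normalizing the same corners (a deliberately crude sampling).
    cube = np.array([[x, y, z] for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)], float)
    sphere = cube / np.linalg.norm(cube, axis=1, keepdims=True)
    print(blend(cube, {(0, 1): sphere}, p=(0.5, 0.0, 0.0)))

Superposing the displacement fields lets one, two or three transformations combine into a single hybrid form, which is what students are asked to predict before running the program.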

Relevance: 40.00%

Abstract:

We describe methods for obtaining a quantitative description of RNA processing at high resolution in budding yeast. As a model gene expression system, we constructed tetON (for induction studies) and tetOFF (for repression, derepression, and RNA degradation studies) yeast strains with a series of reporter genes integrated in the genome under the control of a tetO7 promoter. Reverse transcription and quantitative real-time PCR (RT-qPCR) methods were adapted to allow the determination of mRNA abundance as the average number of copies per cell in a population. Fluorescence in situ hybridization (FISH) measurements of transcript numbers in individual cells validated the RT-qPCR approach for average copy-number determination, despite the broad distribution of transcript levels within a population of cells. In addition, RT-qPCR was used to distinguish the products of the different steps in splicing of the reporter transcripts, and methods were developed to map and quantify 3′-end cleavage and polyadenylation. This system permits pre-mRNA production, splicing, 3′-end maturation and degradation to be monitored quantitatively with unprecedented kinetic detail, suitable for mathematical modeling. Using this approach, we demonstrate that reporter transcripts are spliced prior to their 3′-end cleavage and polyadenylation, that is, cotranscriptionally.
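
As an illustration of the copies-per-cell calculation (a minimal sketch with invented numbers; the paper's actual calibration details may differ), absolute quantification typically maps Ct values onto a standard curve of known copy numbers and divides by the number of cells in the reverse-transcription input:

    import numpy as np

    # Hypothetical example (invented numbers): map Ct values onto a standard
    # curve of known copy numbers, then divide by the number of cells in the
    # reverse-transcription input.

    def copies_per_cell(ct, slope, intercept, cells_per_rt):
        """Invert Ct = slope * log10(copies) + intercept, then scale per cell."""
        copies = 10 ** ((ct - intercept) / slope)
        return copies / cells_per_rt

    # Standard curve from a dilution series of an in vitro transcript:
    log_copies = np.log10([1e3, 1e4, 1e5, 1e6])
    ct_values = np.array([29.8, 26.4, 23.1, 19.7])  # made-up Ct readings
    slope, intercept = np.polyfit(log_copies, ct_values, 1)

    print(copies_per_cell(ct=24.5, slope=slope, intercept=intercept,
                          cells_per_rt=1e4))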

Relevance: 40.00%

Abstract:

The introduction of linear functions is the turning point at which many students decide whether or not mathematics is useful. This means the role of parameters and variables in linear functions could be considered to be ‘threshold concepts’. There is recognition that linear functions can be taught in context through the exploration of linear modelling examples, but this has its limitations. Currently, statistical data are readily available, and graphics or computer algebra system (CAS) calculators are common in many classrooms. This technology provides easy access to different representations of linear functions, as well as the ability to fit a least-squares line to real-life data. These calculators could therefore support an alternative approach to the introduction of linear functions. This study compares the results of an end-of-topic test for two classes of Australian middle secondary students at a regional school to determine whether such an alternative approach is feasible. In this study, test questions were grouped by concept and subjected to a concept-by-concept analysis of the mean test results of the two classes. This analysis revealed that the students following the alternative approach demonstrated greater competence with non-standard questions.
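
For concreteness, the least-squares fit that such calculators perform can be sketched in a few lines (the data below are invented for illustration):

    import numpy as np

    # Invented data for illustration: fit the least-squares line y = a*x + b
    # that a graphics or CAS calculator would produce for a real-life data set.

    hours_of_sunshine = np.array([2, 4, 5, 7, 8, 10])     # hypothetical predictor
    max_temperature = np.array([13, 16, 18, 21, 23, 27])  # hypothetical response

    a, b = np.polyfit(hours_of_sunshine, max_temperature, deg=1)
    print(f"least-squares line: y = {a:.2f}x + {b:.2f}")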

Relevance: 40.00%

Abstract:

The paper presents a new methodology for modeling material failure in two-dimensional reinforced concrete members using the Continuum Strong Discontinuity Approach (CSDA). Mixture theory is used as the methodological approach to model reinforced concrete as a composite material, constituted by a plain concrete matrix reinforced with two embedded orthogonal long-fiber bundles (rebars). Matrix failure is modeled on the basis of a continuum damage model equipped with strain softening, whereas the rebar effects are modeled by means of phenomenological constitutive models devised to reproduce the axial non-linear behavior, as well as the bond-slip and dowel effects. The proposed methodology extends the fundamental ingredients of the standard Strong Discontinuity Approach, and the embedded-discontinuity finite element formulations for homogeneous materials, to matrix/fiber composite materials such as reinforced concrete. The specific aspects of material failure modeling for these composites are also addressed. A number of available experimental tests are reproduced in order to illustrate the feasibility of the proposed methodology.
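
Schematically, and in generic notation that is not necessarily the paper's, the mixture rule composes the composite stress from volume-weighted matrix and rebar contributions, with matrix softening driven by a scalar damage variable:

    \sigma = k_m\,\sigma_m + k_{f_1}\,\sigma_{f_1} + k_{f_2}\,\sigma_{f_2},
    \qquad \sigma_m = (1-d)\,\bar{\sigma}_m, \quad 0 \le d \le 1

where k_m, k_{f_1} and k_{f_2} are the volume fractions of the matrix and the two rebar directions, \bar{\sigma}_m is the effective (undamaged) matrix stress, and strain softening corresponds to d growing toward 1 along the discontinuity.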

Relevance: 40.00%

Abstract:

An improved on-site characterization of humic-rich hydrocolloids and their metal species in aquatic environments was the goal of the present approach. Both ligand exchange with strong chelators (diethylenetriaminepentaacetic acid (DTPA), ethylenediaminetetraacetic acid (EDTA)) and metal exchange with strongly competitive cations (Cu(II)) were used on-site to characterize the conditional stability and availability of colloidal metal species in a humic-rich German bogwater lake (Venner Moor, Münsterland). A mobile, time-controlled tangential-flow ultrafiltration technique (cut-off: 1 kDa) was applied to differentiate operationally between colloidal metal species and free metal ions. DOC (dissolved organic carbon) and metal determinations were carried out off-site using a home-built carbon analyzer and conventional ICP-OES (inductively coupled plasma optical emission spectrometry), respectively. From the metal exchange equilibria obtained on-site, the kinetic and thermodynamic stability of the original metal species (Fe, Mn, Zn) could be characterized. Conditional exchange constants K_ex obtained from aquatic metal species and competitive Cu(II) ions follow the order Mn > Zn >> Fe. Evidently, Mn and Zn bound to humic-rich hydrocolloids are very strongly displaced by Cu(II) ions, in contrast to Fe, which is scarcely exchangeable. The exchange of aquatic metal species (e.g. Fe) by DTPA/EDTA exhibited relatively slow kinetics but rather high metal availabilities, in contrast to their Cu(II) exchange.
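
For a generic divalent metal M bound to a humic-substance (HS) site, the on-site competition experiment can be summarized by the exchange equilibrium (a schematic form, not necessarily the paper's exact notation):

    \mathrm{M{-}HS} + \mathrm{Cu^{2+}} \rightleftharpoons \mathrm{Cu{-}HS} + \mathrm{M^{2+}},
    \qquad
    K_{\mathrm{ex}} = \frac{[\mathrm{Cu{-}HS}]\,[\mathrm{M^{2+}}]}{[\mathrm{M{-}HS}]\,[\mathrm{Cu^{2+}}]}

so that the reported order Mn > Zn >> Fe reflects how readily Cu(II) displaces each metal from the hydrocolloid.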

Relevance: 40.00%

Abstract:

In conformational analysis, the systematic search method completely maps the conformational space but suffers from the combinatorial explosion problem, because the number of conformations increases exponentially with the number of free rotation angles. This study introduces a new methodology of conformational analysis that controls the combinatorial explosion. It is based on a dimensional reduction of the system through the use of principal component analysis. The results are exactly the same as those obtained with the complete search but, in this case, the number of conformations increases only quadratically with the number of free rotation angles. The method is applied to a series of three drugs, omeprazole, pantoprazole and lansoprazole: benzimidazoles that suppress gastric-acid secretion by means of H(+),K(+)-ATPase enzyme inhibition.
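
The scaling argument can be sketched as follows (a hypothetical illustration, not the study's code): project sampled conformations onto a few principal components and enumerate a grid there, so the number of candidates grows with the grid resolution in the reduced space rather than exponentially with the number of rotation angles:

    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical illustration of the dimensional-reduction idea: enumerate
    # candidate conformations on a grid in the space of the leading principal
    # components instead of over every free rotation angle.

    rng = np.random.default_rng(0)
    n_angles = 8                                            # free rotation angles
    sample = rng.uniform(0.0, 360.0, size=(500, n_angles))  # sampled conformers

    pca = PCA(n_components=2)
    scores = pca.fit_transform(sample)

    # A 20 x 20 grid over two components gives 400 candidates regardless of
    # n_angles; a full grid over all angles would need 20**n_angles points.
    u = np.linspace(scores[:, 0].min(), scores[:, 0].max(), 20)
    v = np.linspace(scores[:, 1].min(), scores[:, 1].max(), 20)
    grid = np.array([(a, b) for a in u for b in v])
    candidates = pca.inverse_transform(grid) % 360.0
    print(candidates.shape)  # (400, 8): candidate sets of rotation angles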

Relevance: 40.00%

Abstract:

In this dissertation, I use qualitative research methods to study relationships between compositionists and faculty in other disciplines in the context of cross-curricular literacy (CCL) work. Drawing on a two-year CCL project in the biology department, for which I was a participant observer, I argue that compositionists need to attend more carefully to issues that influence day-to-day interactions with disciplinary faculty in order to develop more meaningful CCL relationships. Toward that end, I offer a revisionary approach to cross-curricular literacy work that cultivates complex relationships by delaying consensus and embracing disconnection and disorientation. More specifically, I employ revisionary stance as a discursive strategy to complicate three key concepts in CCL literature and scholarship—expertise, change, and outcomes. I re-vision three texts produced during my time in the biology department in order to illuminate the complexities of negotiating expertise, recognizing change, and pursuing outcomes in CCL contexts. Given the reciprocal relationship between discursive and material change (Lee), I maintain that revision of CCL discourse can inspire revision on a pedagogical level, shaping how compositionists and disciplinary faculty participate in CCL interactions. Thus, a revisionary approach leads me to conceptualize revisionary pedagogy for cross-curricular literacy work. I theorize revisionary pedagogy as a means of fostering pedagogical relationships in CCL contexts, complicating how relationships are framed in traditional Writing Across the Curriculum/Writing in the Disciplines scholarship. The literature advances three main conceptual models of CCL, each of which embraces expertise, change, and outcomes in ways that sponsor potentially problematic relationships between compositionists and disciplinary faculty. I draw on Composition scholars’ rich conceptualization of revision (Jung; Lee; Welch) and pedagogy (Kameen; Qualley; Stenberg) to challenge the litany of next-best models and imagine alternative possibilities for relationships in CCL contexts. Revisionary pedagogy is a means of approaching material circumstances that reconstitutes how compositionists and disciplinary faculty conceive of and participate in CCL relationships.

Relevance: 40.00%

Abstract:

Although parrots share with corvids and primates many of the traits believed to be associated with advanced cognitive processing, knowledge of parrot cognition is still limited to a few species, none of which are Neotropical. Here we examine the ability of three Neotropical parrot species (Blue-fronted Amazons, Hyacinth and Lear's macaws) to spontaneously solve a novel physical problem: the string-pulling test. The ability to pull up a string to obtain out-of-reach food has often been considered a cognitively complex task, as it requires the use of a sequence of actions never previously assembled, along with the ability to continuously monitor the string, the food and certain body movements. We presented subjects with pulling tasks in which we varied the spatial relationship between the strings, the presence of a reward, and the physical contact between the string and the reward, to determine (1) whether string-pulling is goal-oriented in these parrots, (2) whether the string is recognized as a means of obtaining the reward, and (3) whether subjects can visually determine the continuity between the string and the reward, selecting only those strings with no physical gap between string and reward. Our results show that some individuals of all species were able to use the string as a means of reaching a specific goal, in this case the retrieval of the food treat. Subjects of both macaw species were also able to visually determine the presence of physical continuity between the string and the reward, making choices consistent with the recognition that no gap should be present between the two. Our findings highlight the potential of this taxonomic group for understanding the underpinnings of cognition in evolutionarily distant groups such as birds and primates.

Relevance: 40.00%

Abstract:

The main aim of this Ph.D. dissertation is the study of clustering dependent data by means of copula functions, with particular emphasis on microarray data. Copula functions are a popular multivariate modeling tool in every field where multivariate dependence is of great interest, yet their use in clustering has not previously been investigated. The first part of this work reviews the literature on clustering methods, copula functions and microarray experiments. Attention focuses on the K-means (Hartigan, 1975; Hartigan and Wong, 1979), hierarchical (Everitt, 1974) and model-based (Fraley and Raftery, 1998, 1999, 2000, 2007) clustering techniques, because their performance is compared in this work. The probabilistic interpretation of Sklar's theorem (Sklar, 1959), estimation methods for copulas such as Inference for Margins (Joe and Xu, 1996), and the Archimedean and Elliptical copula families are then presented. Finally, applications of clustering methods and copulas to genetic and microarray experiments are highlighted. The second part contains the original contribution proposed. A simulation study is performed in order to evaluate the performance of the K-means and hierarchical bottom-up clustering methods in identifying clusters according to the dependence structure of the data-generating process. Different simulations are performed by varying different conditions (e.g., the kind of margins (distinct, overlapping and nested) and the value of the dependence parameter), and the results are evaluated by means of different measures of performance. In light of the simulation results and of the limits of the two investigated clustering methods, a new clustering algorithm based on copula functions ('CoClust' in brief) is proposed. The basic idea, the iterative procedure of the CoClust and a description of the R functions written, with their output, are given. The CoClust algorithm is tested on simulated data (varying the number of clusters, the copula models, the dependence parameter value and the degree of overlap of the margins) and is compared with the performance of model-based clustering using different measures of performance, such as the percentage of well-identified numbers of clusters and the percentage of non-rejection of H0 on the dependence parameter. It is shown that the CoClust algorithm overcomes all observed limits of the other investigated clustering techniques and is able to identify clusters according to the dependence structure of the data, independently of the degree of overlap of the margins and the strength of the dependence. The CoClust uses a criterion based on the maximized log-likelihood function of the copula and can account for virtually any possible dependence relationship between observations. Many peculiar characteristics of the CoClust are shown, e.g. its capability of identifying the true number of clusters and the fact that it does not require a starting classification. Finally, the CoClust algorithm is applied to the real microarray data of Hedenfalk et al. (2001), both to the gene expressions observed in three different cancer samples and to the columns (tumor samples) of the whole data matrix.
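
To make the setting concrete (an assumed setup in Python rather than the dissertation's R code), data whose cluster structure lies in the dependence rather than in the margins can be simulated from a Gaussian copula via the probability integral transform:

    import numpy as np
    from scipy import stats

    # Assumed setup: draw observations whose dependence comes from a Gaussian
    # copula with parameter rho, attaching arbitrary margins via the
    # probability integral transform.

    def gaussian_copula_sample(rho, n, margins, rng):
        """margins: list of frozen scipy.stats distributions, one per dim."""
        k = len(margins)
        cov = np.full((k, k), rho)
        np.fill_diagonal(cov, 1.0)
        z = rng.multivariate_normal(np.zeros(k), cov, size=n)
        u = stats.norm.cdf(z)  # copula scale: dependent uniforms on (0, 1)
        return np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(margins)])

    rng = np.random.default_rng(1)
    # Strongly dependent triple with deliberately overlapping margins:
    x = gaussian_copula_sample(rho=0.8, n=200,
                               margins=[stats.norm(0, 1), stats.norm(0.2, 1),
                                        stats.gamma(2)], rng=rng)
    print(np.corrcoef(x, rowvar=False).round(2))

Margins can overlap arbitrarily here, which is precisely the regime in which location-based methods such as K-means are expected to struggle and the CoClust's copula-likelihood criterion is designed to work.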

Relevance: 40.00%

Abstract:

The work investigates the feasibility of a new process aimed at the production of hydrogen with inherent separation of carbon oxides. The process consists of a cycle in which, in the first step, a mixed metal oxide is reduced by ethanol (obtained from biomass). The reduced metal is then contacted with steam in order to split the water, sequestering the oxygen into the looping material's structure. The oxides used to run this thermochemical cycle, also called the "steam-iron process", are mixed ferrites with the spinel structure MeFe2O4 (Me = Fe, Co, Ni or Cu). To understand the reactions involved in the anaerobic reforming of ethanol, diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) was used, coupled with mass analysis of the effluent, to study the surface composition of the ferrites during the adsorption of ethanol and its transformations during the temperature program. This study was paired with tests on a laboratory-scale plant and with the characterization of the materials, as synthesized and at different reduction degrees, through various techniques (XRD, Mössbauer spectroscopy, elemental analysis...). In the first step it was found that, besides the expected CO, CO2 and H2O (the products of the anaerobic oxidation of ethanol), a large amount of H2 and coke was also produced. The latter is highly undesirable, since it affects the second step, during which water is fed over the pre-reduced spinel at high temperature. The behavior of the different spinels was affected by the nature of the divalent metal cation; magnetite was the oxide showing the slowest rate of reduction by ethanol but, on the other hand, it was the one that could perform the entire cycle of the process most efficiently. Still, the problem of coke formation remains the greatest challenge to solve.
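
Schematically, the looping cycle can be written as follows, with δ and x left unspecified because the actual stoichiometry depends on the reduction degree and the divalent metal (an illustrative summary, not the paper's balanced equations):

    Step 1 (reduction by ethanol):  MeFe2O4 + δ CH3CH2OH → MeFe2O4-x + CO, CO2, H2O (+ H2, coke)
    Step 2 (water splitting):       MeFe2O4-x + x H2O → MeFe2O4 + x H2

Coke deposited in step 1 is carried into step 2, where it can contaminate the hydrogen stream with carbon oxides, which is why its formation is the key obstacle noted above.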

Relevance: 40.00%

Abstract:

Aims: Cardiac grafts from non-heartbeating donors (NHBDs) could significantly increase organ availability and reduce waiting-list mortality. Reluctance to exploit hearts from NHBDs arises from obligatory delays in procurement, leading to periods of warm ischemia and possible subsequent contractile dysfunction. Means for the early prediction of graft suitability prior to transplantation are thus required for the development of heart transplantation programs with NHBDs.

Methods and Results: Hearts (n = 31) isolated from male Wistar rats were perfused with modified Krebs-Henseleit buffer aerobically for 20 min, followed by global, no-flow ischemia (32°C) for 30, 50, 55 or 60 min. Reperfusion was unloaded for 20 min, and then loaded, in working mode, for 40 min. Left ventricular (LV) pressure was monitored using a micro-tip pressure catheter introduced via the mitral valve. Several hemodynamic parameters measured during early, unloaded reperfusion correlated significantly with LV work after 60 min of reperfusion (p<0.001). Coronary flow and the production of lactate and lactate dehydrogenase (LDH) also correlated significantly with outcomes after 60 min of reperfusion (p<0.05). Based on early-reperfusion hemodynamic measures, a composite, weighted predictive parameter incorporating heart rate (HR), developed pressure (DP) and end-diastolic pressure was generated and evaluated against the HR×DP product after 60 min of reperfusion. Effective discriminating ability for this novel parameter was observed for four HR×DP cut-off values, particularly for ≥20 × 10³ mmHg·beats·min⁻¹ (p<0.01).

Conclusion: Upon reperfusion of an NHBD heart, early evaluation, at the time of organ procurement, of cardiac hemodynamic parameters, as well as of easily accessible markers of metabolism and necrosis, seems to accurately predict subsequent contractile recovery and could thus potentially be of use in guiding the decision to accept an ischemic heart for transplantation.
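
As a trivial numerical illustration of the cut-off rule (field names and values are hypothetical, and the published composite parameter additionally weights end-diastolic pressure, which is omitted here):

    # Hypothetical illustration of the HR x DP cut-off (invented values).

    CUTOFF = 20e3  # mmHg * beats / min

    def passes_hr_dp_cutoff(heart_rate_bpm: float, developed_pressure_mmHg: float) -> bool:
        """True if the HR x DP product meets the >= 20e3 threshold."""
        return heart_rate_bpm * developed_pressure_mmHg >= CUTOFF

    print(passes_hr_dp_cutoff(240, 95))  # 240 * 95 = 22,800 -> True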