46 results for electromechanical analogy
Abstract:
This paper presents a reflective narrative of the process of designing a PhD project. Using the analogy of the play 'One Man, Two Guvnors', this paper discusses the tensions a beginning researcher faces in reconciling her own vision for a project with the academic demands of doctoral-level study. Focusing on an ethnographic study of a reading group for visually-impaired people, the paper explores how the researcher's developing understanding of the considerations necessary when working with disabled people impacted on the research design. In particular, it focuses on the conflict faced by doctoral students when working in a paradigm that requires actively involving research participants, thereby relinquishing some control over the project. The aim of the paper is to provide an honest narrative that will resonate with other beginning researchers.
Abstract:
With the advent of mass digitization projects, such as the Google Book Search, a peculiar shift has occurred in the way that copyright works are dealt with. Contrary to what has so far been the case, works are turned into machine-readable data to be automatically processed for various purposes without the expression of works being displayed to the public. In the Google Book Settlement Agreement, this new kind of usage is referred to as ‘non-display uses’ of digital works. The legitimacy of these uses has not yet been tested by the courts and does not comfortably fit in the current copyright doctrine, plainly because the works are not used as works but as something else, namely as data. Since non-display uses may prove to be a very lucrative market in the near future, with the potential to affect the way people use copyright works, we examine non-display uses under the prism of copyright principles to determine the boundaries of their legitimacy. Through this examination, we provide a categorization of the activities carried out under the heading of ‘non-display uses’, we examine their lawfulness under the current copyright doctrine and approach the phenomenon from the spectrum of data protection law that could apply, by analogy, to the use of copyright works as processable data.
Abstract:
In 'Avalanche', an object is lowered, players staying in contact throughout. Normally the task is easily accomplished. However, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. For more players, sets of balancing loops interact and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model gives insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. Behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations; Prisoners' Dilemma and integrated bargaining situations.
Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
Abstract:
The membrane-bound form of mammalian aminopeptidase P (AP-P; EC 3.4.11.9) is a mono-zinc-containing enzyme that lacks any of the typical metal binding motifs found in other zinc metalloproteases. To identify residues involved in metal binding and catalysis, sequence and structural information was used to align the sequence of porcine membrane-bound AP-P with other members of the peptidase clan MG, including Escherichia coli AP-P and methionyl aminopeptidases. Residues predicted to be critical for activity were mutated and the resultant proteins were expressed in COS-1 cells. Immunoelectrophoretic blot analysis was used to compare the levels of expression of the mutant proteins, and their ability to hydrolyze bradykinin and Gly-Pro-hydroxyPro was assessed. Asp449, Asp460, His523, Glu554, and Glu568 are predicted to serve as metal ion ligands in the active site, and mutagenesis of these residues resulted in fully glycosylated proteins that were catalytically inactive. Mutation of His429 and His532 also resulted in catalytically inactive proteins, and these residues, by analogy with E. coli AP-P, are likely to play a role in shuttling protons during catalysis. These studies indicate that mammalian membrane-bound AP-P has an active-site configuration similar to that of other members of the peptidase clan MG, which is compatible with either a dual metal ion model or a single metal ion in the active site. The latter model is consistent, however, with the known metal stoichiometry of both the membrane-bound and cytosolic forms of AP-P and with a recently proposed model for methionyl aminopeptidase.
Abstract:
The relevance of chaotic advection to stratospheric mixing and transport is addressed in the context of (i) a numerical model of forced shallow-water flow on the sphere, and (ii) a middle-atmosphere general circulation model. It is argued that chaotic advection applies to both these models if there is suitable large-scale spatial structure in the velocity field and if the velocity field is temporally quasi-regular. This spatial structure is manifested in the form of “cat’s eyes” in the surf zone, such as are commonly seen in numerical simulations of Rossby wave critical layers; by analogy with the heteroclinic structure of a temporally aperiodic chaotic system the cat’s eyes may be thought of as an “organizing structure” for mixing and transport in the surf zone. When this organizing structure exists, Eulerian and Lagrangian autocorrelations of the velocity derivatives indicate that velocity derivatives decorrelate more rapidly along particle trajectories than at fixed spatial locations (i.e., the velocity field is temporally quasi-regular). This phenomenon is referred to as Lagrangian random strain.
Abstract:
There exists a well-developed body of theory based on quasi-geostrophic (QG) dynamics that is central to our present understanding of large-scale atmospheric and oceanic dynamics. An important question is the extent to which this body of theory may generalize to more accurate dynamical models. As a first step in this process, we here generalize a set of theoretical results, concerning the evolution of disturbances to prescribed basic states, to semi-geostrophic (SG) dynamics. SG dynamics, like QG dynamics, is a Hamiltonian balanced model whose evolution is described by the material conservation of potential vorticity, together with an invertibility principle relating the potential vorticity to the advecting fields. SG dynamics has features that make it a good prototype for balanced models that are more accurate than QG dynamics. In the first part of this two-part study, we derive a pseudomomentum invariant for the SG equations, and use it to obtain: (i) linear and nonlinear generalized Charney–Stern theorems for disturbances to parallel flows; (ii) a finite-amplitude local conservation law for the invariant, obeying the group-velocity property in the WKB limit; and (iii) a wave-mean-flow interaction theorem consisting of generalized Eliassen–Palm flux diagnostics, an elliptic equation for the stream-function tendency, and a non-acceleration theorem. All these results are analogous to their QG forms. The pseudomomentum invariant – a conserved second-order disturbance quantity that is associated with zonal symmetry – is constructed using a variational principle in a similar manner to the QG calculations. Such an approach is possible when the equations of motion under the geostrophic momentum approximation are transformed to isentropic and geostrophic coordinates, in which the ageostrophic advection terms are no longer explicit. Symmetry-related wave-activity invariants such as the pseudomomentum then arise naturally from the Hamiltonian structure of the SG equations. 
We avoid use of the so-called ‘massless layer’ approach to the modelling of isentropic gradients at the lower boundary, preferring instead to incorporate explicitly those boundary contributions into the wave-activity and stability results. This makes the analogy with QG dynamics most transparent. This paper treats the f-plane Boussinesq form of SG dynamics, and its recent extension to β-plane, compressible flow by Magnusdottir & Schubert. In the limit of small Rossby number, the results reduce to their respective QG forms. Novel features particular to SG dynamics include apparently unnoticed lateral boundary stability criteria in (i), and the necessity of including additional zonal-mean eddy correlation terms besides the zonal-mean potential vorticity fluxes in the wave-mean-flow balance in (iii). In the companion paper, wave-activity conservation laws and stability theorems based on the SG form of the pseudoenergy are presented.
Abstract:
A distinction between the domestic and commercial context is commonly drawn in property law discourse and has been brought into focus by three recent House of Lords' decisions. The thesis of this paper is that while the distinction is a useful explanatory tool, it runs into difficulties when given legal effect by the courts. There is a definitional problem in understanding what is included within each context. Indeed, the distinction assumes the existence of a dichotomy when, in fact, the domestic and commercial spheres are better seen as a continuum. In Stack v Dowden, the majority of the House of Lords gave legal effect to context and considered that different rules should apply to determine ownership of the home. This paper locates its decision in the broader debate on judicial restraint and creativity. By analogy with current discussion of due deference in public law, it is suggested that, in light of the policy issues involved and the broader ramifications of the decision, insufficient justification was given for the approach adopted by the majority.
Abstract:
Sea ice friction models are necessary to predict the nature of interactions between sea ice floes. These interactions are of interest on a range of scales, for example, to predict loads on engineering structures in icy waters or to understand the basin-scale motion of sea ice. Many models use Amontons' friction law due to its simplicity. More advanced models allow for hydrodynamic lubrication and refreezing of asperities; however, modeling these processes leads to greatly increased complexity. In this paper we propose, by analogy with rock physics, that a rate- and state-dependent friction law allows us to incorporate memory (and thus the effects of lubrication and bonding) into ice friction models without a great increase in complexity. We support this proposal with experimental data on both the laboratory (∼0.1 m) and ice tank (∼1 m) scale. These experiments show that the effects of static contact under normal load can be incorporated into a friction model. We find the parameters for a first-order rate and state model to be A = 0.310, B = 0.382, and μ0 = 0.872. Such a model then allows us to make predictions about the nature of memory effects in moving ice-ice contacts.
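The first-order rate- and state-dependent law referred to in this abstract is commonly written in the Dieterich "ageing" form, μ = μ0 + A ln(V/V0) + B ln(V0 θ/Dc) with dθ/dt = 1 − Vθ/Dc. The abstract quotes only A, B, and μ0; the reference velocity V0 and critical slip distance Dc below are illustrative placeholders, so this is a minimal sketch of the functional form, not the authors' fitted model:

```python
import numpy as np

# First-order rate-and-state friction (Dieterich "ageing" law):
#   mu = mu0 + A*ln(V/V0) + B*ln(V0*theta/Dc),  dtheta/dt = 1 - V*theta/Dc
# A, B, mu0 are the values quoted in the abstract; V0 and Dc are
# assumed placeholders (not given in the abstract).
A, B, MU0 = 0.310, 0.382, 0.872
V0, DC = 1e-4, 1e-3  # reference velocity (m/s), critical slip distance (m) -- assumed

def friction(v, theta):
    """Instantaneous friction coefficient at sliding velocity v and state theta."""
    return MU0 + A * np.log(v / V0) + B * np.log(V0 * theta / DC)

def evolve_state(theta, v, dt):
    """Ageing-law state evolution, explicit Euler step."""
    return theta + dt * (1.0 - v * theta / DC)

# At steady state dtheta/dt = 0, so theta_ss = Dc/V and
# mu_ss = mu0 + (A - B)*ln(V/V0). Since B > A for the quoted values,
# steady-state friction decreases with velocity (velocity weakening),
# which is the regime in which stick-slip and memory effects appear.
v = V0
print(friction(v, DC / v))  # at v = V0, steady-state friction equals mu0
```

Because B > A, the quoted parameters place the ice-ice contact in the velocity-weakening regime; the state variable θ is what carries the memory of static contact time that the abstract describes.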
Abstract:
This article considers cinematic time in James Benning’s film, casting a glance (2007), in relation to its subject, Robert Smithson’s 1970 earthwork Spiral Jetty, and his film of the same name. The radicalism of Smithson’s thinking on time has been widely acknowledged, and his influence continues to pervade contemporary artistic practice. The relationship of Benning’s films with this legacy may appear somewhat oblique, given their apparent phenomenological rendition of ‘real time’. However, closer examination of Benning’s formal strategies reveals a more complex temporal construction, characterized by uncertain intervals that interrupt the folding of cinematic time into the flow of consciousness. Smithson’s film uses cinematic analogy to gesture towards vast reaches of geological time; Benning’s film creates a simulated timescale to evoke the short history of the earthwork itself. Smithson’s embrace of the entropic was a counter-cultural stance at the end of the 1960s, but under the shadow of ecological disaster, this orientation has come to appear melancholy and romantic rather than radical. Benning’s film returns the jetty to anthropic time, but raises questions about the ways we inhabit time. His practice of working with ‘borrowed time’ is particularly suited to the cultural and historical moment of his later work.
Abstract:
The structural analogy between Ni-doped greigite minerals (Fe3S4) and the (Fe,Ni)S clusters present in biological enzymes has led to suggestions that these minerals could have acted as catalysts for the origin of life. However, little is known about the distribution and stability of Ni dopants in the greigite structure. We present here a theoretical investigation of mixed thiospinels (Fe1
Abstract:
Simultaneous scintillometer measurements at multiple wavelengths (pairing visible or infrared with millimetre or radio waves) have the potential to provide estimates of path-averaged surface fluxes of sensible and latent heat. Traditionally, the equations to deduce fluxes from measurements of the refractive index structure parameter at the two wavelengths have been formulated in terms of absolute humidity. Here, it is shown that formulation in terms of specific humidity has several advantages. Specific humidity satisfies the requirement for a conserved variable in similarity theory and inherently accounts for density effects misapportioned through the use of absolute humidity. The validity and interpretation of both formulations are assessed and the analogy with open-path infrared gas analyser density corrections is discussed. Original derivations using absolute humidity to represent the influence of water vapour are shown to misrepresent the latent heat flux. The errors in the flux, which depend on the Bowen ratio (larger for drier conditions), may be of the order of 10%. The sensible heat flux is shown to remain unchanged. It is also verified that use of a single scintillometer at optical wavelengths is essentially unaffected by these new formulations. Where it may not be possible to reprocess two-wavelength results, a density correction to the latent heat flux is proposed for scintillometry, which can be applied retrospectively to reduce the error.
Abstract:
We construct a two-variable model which describes the interaction between local baroclinicity and eddy heat flux in order to understand aspects of the variance in storm tracks. It is a heuristic model for diabatically forced baroclinic instability close to baroclinic neutrality. The two-variable model has the structure of a nonlinear oscillator. It exhibits some realistic properties of observed storm track variability, most notably the intermittent nature of eddy activity. This suggests that apparent threshold behaviour can be more accurately and succinctly described by a simple nonlinearity. An analogy is drawn with triggering of convective events.
Abstract:
This paper considers variations of a neuron pool selection method known as Affordable Neural Network (AfNN). A saliency measure, based on the second derivative of the objective function, is proposed to assess the ability of a trained AfNN to provide neuronal redundancy. The discrepancies between the various affordability variants are explained by correlating unique subgroup selections with relevant saliency variations. Overall, this study shows that the method by which neurons are selected from a pool is more relevant to how salient individual neurons are than to how often a particular neuron is used during training. The findings herein are relevant not only to providing an analogy to brain function but also to optimizing the way a neural network using the affordability method is trained.
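A second-derivative saliency of the kind mentioned above is classically computed as s_i = ½ H_ii w_i² (the "Optimal Brain Damage" form). The abstract does not give the AfNN's exact measure, so the sketch below is a generic illustration on a toy linear model, with the diagonal Hessian H_ii estimated by central finite differences; the quadratic form and all names are assumptions:

```python
import numpy as np

# Generic second-derivative saliency sketch (assumed form, not the AfNN's):
#   s_i = 0.5 * (d^2 L / dw_i^2) * w_i^2
# H_ii is estimated by a central finite difference of the objective L.

def loss(w, x, y):
    """Toy objective: mean squared error of a linear model y ~ x @ w."""
    return 0.5 * np.mean((x @ w - y) ** 2)

def saliencies(w, x, y, eps=1e-4):
    """Per-weight saliency s_i = 0.5 * H_ii * w_i**2 via finite differences."""
    s = np.zeros_like(w)
    base = loss(w, x, y)
    for i in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        h_ii = (loss(wp, x, y) - 2 * base + loss(wm, x, y)) / eps**2
        s[i] = 0.5 * h_ii * w[i] ** 2
    return s

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))
y = x @ np.array([1.0, 0.0, -2.0])
w = np.array([1.0, 0.0, -2.0])  # weight 1 is zero, so its saliency vanishes
print(saliencies(w, x, y))
```

Low-saliency weights are the redundant ones: removing them barely changes the objective, which is the sense in which the abstract links saliency to a trained network's neuronal redundancy.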
Abstract:
If we use the analogy of a virus as a living entity, then the replicative organelle is the body where its metabolic and reproductive activities are concentrated. Recent studies have illuminated the intricately complex replicative organelles of coronaviruses, a group that includes the largest known RNA virus genomes. This review takes a virus-centric look at the coronavirus replication transcription complex organelle in the context of the wider world of positive sense RNA viruses, examining how the mechanisms of protein expression and function act to produce the factories that power the viral replication cycle.
Abstract:
Climate controls fire regimes through its influence on the amount and types of fuel present and their dryness. CO2 concentration constrains primary production by limiting photosynthetic activity in plants. However, although fuel accumulation depends on biomass production, and hence on CO2 concentration, the quantitative relationship between atmospheric CO2 concentration and biomass burning is not well understood. Here a fire-enabled dynamic global vegetation model (the Land surface Processes and eXchanges model, LPX) is used to attribute glacial–interglacial changes in biomass burning to an increase in CO2, which would be expected to increase primary production and therefore fuel loads even in the absence of climate change, vs. climate change effects. Four general circulation models provided last glacial maximum (LGM) climate anomalies – that is, differences from the pre-industrial (PI) control climate – from the Palaeoclimate Modelling Intercomparison Project Phase 2, allowing the construction of four scenarios for LGM climate. Modelled carbon fluxes from biomass burning were corrected for the model's observed prediction biases in contemporary regional average values for biomes. With LGM climate and low CO2 (185 ppm) effects included, the modelled global flux at the LGM was in the range of 1.0–1.4 Pg C year⁻¹, about a third less than that modelled for PI time. LGM climate with pre-industrial CO2 (280 ppm) yielded unrealistic results, with global biomass burning fluxes similar to or even greater than in the pre-industrial climate. It is inferred that a substantial part of the increase in biomass burning after the LGM must be attributed to the effect of increasing CO2 concentration on primary production and fuel load. Today, by analogy, both rising CO2 and global warming must be considered as risk factors for increasing biomass burning. Both effects need to be included in models to project future fire risks.