932 results for Shared component model
Abstract:
We measure the spectral properties of a representative sub-sample of 187 quasars, drawn from the Parkes Half-Jansky Flat-radio-spectrum Sample (PHFS). Quasars with a wide range of rest-frame optical/UV continuum slopes are included in the analysis: their colours range over 2 < B-K < 7. We present composite spectra of red and blue sub-samples of the PHFS quasars, and tabulate their emission line properties. The median Hbeta and [O III] emission line equivalent widths of the red quasar sub-sample are a factor of ten weaker than those of the blue quasar sub-sample. No significant differences are seen between the equivalent width distributions of the C IV, C III] and Mg II lines. Both the colours and the emission line equivalent widths of the red quasars can be explained by the addition of a featureless red synchrotron continuum component to an otherwise normal blue quasar spectrum. The red synchrotron component must have a spectrum at least as red as a power-law of the form F_nu ∝ nu^(-2.8). The relative strengths of the blue and red components span two orders of magnitude at rest-frame 500 nm. The blue component is weaker relative to the red component in low optical luminosity sources. This suggests that the fraction of accretion energy going into optical emission from the jet is greater in low luminosity quasars. This correlation between colour and luminosity may be of use in cosmological distance scale work. This synchrotron model does not, however, fit ~10% of the quasars, which have both red colours and high equivalent width emission lines. We hypothesise that these red, strong-lined quasars have intrinsically weak Big Blue Bumps. There is no discontinuity in spectral properties between the BL Lac objects in our sample and the other quasars. BL Lac objects appear to be the red, low equivalent width tail of a continuous distribution. The synchrotron emission component only dominates the spectrum at longer wavelengths, so existing BL Lac surveys will be biased against high redshift objects. This will affect measurements of BL Lac evolution. The blue PHFS quasars have significantly higher equivalent width C IV, Hbeta and [O III] emission than a matched sample of optically selected QSOs.
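The dilution argument in this abstract can be illustrated with a small numerical sketch: adding a featureless red power-law continuum (F_nu ∝ nu^(-2.8)) on top of a normal blue quasar continuum raises the continuum level under each line without adding any line flux, so the measured equivalent widths drop. All slopes, normalisations and the Hbeta line flux below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative two-component continuum (values are assumptions, not the paper's fits).
wav = np.linspace(200.0, 900.0, 2000)      # rest-frame wavelength in nm
nu = 3.0e17 / wav                          # frequency in Hz (c = 3e17 nm/s)

# "Normal" blue quasar continuum: F_nu ~ nu^-0.3 (a typical optical/UV slope).
f_blue = (nu / nu.mean()) ** (-0.3)
i_500 = np.argmin(np.abs(wav - 500.0))

def red_component(R_500nm):
    """Featureless red synchrotron continuum, F_nu ~ nu^-2.8,
    normalised so that R_500nm is the red/blue flux ratio at 500 nm."""
    nu_500 = 3.0e17 / 500.0
    return R_500nm * f_blue[i_500] * (nu / nu_500) ** (-2.8)

def equivalent_width(line_flux, continuum_at_line):
    """EW of an emission line of fixed integrated flux on a given continuum."""
    return line_flux / continuum_at_line   # in nm

line_wav = 486.1                           # Hbeta, for illustration
i = np.argmin(np.abs(wav - line_wav))
line_flux = 5.0                            # arbitrary units

for R in [0.0, 1.0, 10.0]:                 # red/blue ratio at 500 nm
    cont = f_blue[i] + red_component(R)[i]
    print(f"R = {R:5.1f}  ->  EW = {equivalent_width(line_flux, cont):.2f} nm")
# The line flux never changes, yet the EW falls as the red component grows,
# reproducing the qualitative behaviour described for the red sub-sample.
```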
Abstract:
The q-deformed supersymmetric t-J model on a semi-infinite lattice is diagonalized by using the level-one vertex operators of the quantum affine superalgebra U_q[sl(2|1)]. We give the bosonization of the boundary states, present an integral expression for the correlation functions of the boundary model, and derive the difference equations which they satisfy.
Abstract:
Time availability is a key concept in relation to volunteering, leading organisations and governments to target those outside paid work as a potential source of volunteers. It may be that factors such as a growth in female participation in the labour market and an increase in work hours will lead to more people saying they are simply too busy to volunteer. This paper discusses how social and economic change, such as changing work patterns, is impacting on time availability. Using the 1997 ABS Time Use data, it identifies a predictive model of spare time by looking at demographic, life stage and employment-related variables. Results confirm that those outside paid work, particularly the young, males and those without partners or children, are the groups most likely to have time to spare. These groups do not currently report high rates of volunteering. The paper concludes by questioning the premise that people will volunteer simply because they have time to spare. This is just one component of a range of motivations and factors that influence the decision to volunteer.
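As a rough illustration of the kind of predictive model described (spare time regressed on demographic, life-stage and employment variables), the sketch below fits an ordinary least squares model to simulated data. The variable names, simulated values and coefficients are placeholders and are not drawn from the 1997 ABS Time Use survey.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical micro-data standing in for the time-use file: purely illustrative.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age":          rng.integers(18, 75, n),
    "male":         rng.integers(0, 2, n),
    "has_partner":  rng.integers(0, 2, n),
    "has_children": rng.integers(0, 2, n),
    "work_hours":   rng.integers(0, 50, n),
})
# Simulated outcome: weekly spare-time hours (coefficients chosen arbitrarily).
df["spare_hours"] = (40 - 0.5 * df["work_hours"] + 3 * df["male"]
                     - 4 * df["has_partner"] - 6 * df["has_children"]
                     + rng.normal(0, 5, n))

# OLS: spare time on demographic, life-stage and employment-related terms.
X = sm.add_constant(df[["age", "male", "has_partner", "has_children", "work_hours"]])
model = sm.OLS(df["spare_hours"], X).fit()
print(model.summary())
```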
Abstract:
Event-related potentials (ERPs) were recorded while subjects made old/new recognition judgments on new unstudied words and old words which had been presented at study either once ('weak') or three times ('strong'). The probability of an 'old' response was significantly higher for strong than weak words and significantly higher for weak than new words. Comparisons were made initially between ERPs to new, weak and strong words, and subsequently between ERPs associated with six strength-by-response conditions. The N400 component was found to be modulated by memory trace strength in a graded manner. Its amplitude was most negative in new word ERPs and most positive in strong word ERPs. This 'N400 strength effect' was largest at the left parietal electrode (in ear-referenced ERPs). The amplitude of the late positive complex (LPC) effect was sensitive to decision accuracy (and perhaps confidence). Its amplitude was larger in ERPs evoked by words attracting correct versus incorrect recognition decisions. The LPC effect had a left > right, centro-parietal scalp topography (in ear-referenced ERPs). Hence, whereas the majority of previous ERP studies of episodic recognition have interpreted results from the perspective of dual-process models, we provide alternative interpretations of N400 and LPC old/new effects in terms of memory strength and decisional factor(s). (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
This paper is concerned with the use of scientific visualization methods for the analysis of feedforward neural networks (NNs). Inevitably, the kinds of data associated with the design and implementation of neural networks are of very high dimensionality, presenting a major challenge for visualization. A method is described using the well-known statistical technique of principal component analysis (PCA). This is found to be an effective and useful method of visualizing the learning trajectories of many learning algorithms such as back-propagation and can also be used to provide insight into the learning process and the nature of the error surface.
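A minimal sketch of the visualization idea described here: record the network's flattened weight vector at each training epoch and project the resulting high-dimensional learning trajectory onto its first two principal components. The toy dataset, network architecture and training loop below are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

# Toy problem and a small feedforward network (stand-ins only).
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1,
                    warm_start=True, random_state=0)

# Record the flattened weight vector after each epoch of back-propagation.
# (sklearn will warn that a single iteration did not converge; that is expected.)
snapshots = []
for epoch in range(100):
    net.fit(X, y)                                   # one pass, warm-started
    w = np.concatenate([c.ravel() for c in net.coefs_] +
                       [b.ravel() for b in net.intercepts_])
    snapshots.append(w)

# Project the high-dimensional learning trajectory onto its first two PCs.
trajectory = PCA(n_components=2).fit_transform(np.array(snapshots))
print(trajectory[:5])   # successive rows trace the path through weight space
```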
Abstract:
To reconstruct oceanographic variations in the subtropical South Pacific, 271-year-long subseasonal time series of Sr/Ca and δ18O were generated from a coral growing at Rarotonga (21.5°S, 159.5°W). In this case, coral Sr/Ca appears to be an excellent proxy for sea surface temperature (SST), and coral δ18O is a function of both SST and seawater δ18O composition (δ18O_sw). Here, we focus on extracting the δ18O_sw signal from these proxy records. A method is presented assuming that coral Sr/Ca is solely a function of SST and that coral δ18O is a function of both SST and δ18O_sw. This method separates the effects of δ18O_sw from SST by breaking the instantaneous changes of coral δ18O into separate contributions from instantaneous SST and δ18O_sw changes, respectively. The results show that on average δ18O_sw at Rarotonga explains ~39% of the variance in δ18O and that variations in SST explain the remaining ~61% of the δ18O variance. Reconstructed δ18O_sw shows systematic increases in summer months (December-February), consistent with the regional pattern of variations in precipitation and evaporation. The reconstructed δ18O_sw also shows a positive linear correlation with satellite-derived estimates of salinity for the period 1980 to 1997 (r = 0.72). This linear correlation between reconstructed δ18O_sw and salinity makes it possible to use the reconstructed δ18O_sw to estimate past interannual and decadal salinity changes in this region. Comparisons of coral δ18O and δ18O_sw at Rarotonga with the Pacific decadal oscillation index suggest that the decadal and interdecadal salinity and SST variability at Rarotonga is related to basin-scale decadal variability in the Pacific. Copyright (C) 2002 Elsevier Science Ltd.
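A simplified sketch of the separation described above: Sr/Ca is inverted through a linear SST calibration, the SST-driven part of coral δ18O is then removed, and the residual is interpreted as the δ18O_sw signal. The calibration slopes, reference values and the anomaly (rather than instantaneous-change) formulation below are illustrative assumptions, not the study's fitted parameters.

```python
# Illustrative calibration constants (assumptions, not the study's fitted values).
SR_CA_SLOPE = -0.06     # mmol/mol per degree C: Sr/Ca falls as SST rises
SR_CA_INTCP = 10.5      # mmol/mol at 0 degrees C
D18O_SST_SLOPE = -0.21  # per mil per degree C for coral aragonite

def sst_from_srca(sr_ca):
    """Invert a linear Sr/Ca-SST calibration."""
    return (sr_ca - SR_CA_INTCP) / SR_CA_SLOPE

def d18o_sw(coral_d18o, sr_ca, ref_sst=25.0, ref_d18o=-4.5):
    """Remove the SST-driven part of coral d18O, leaving a seawater-d18O anomaly.

    Changes are taken relative to arbitrary reference values, so only
    anomalies in d18O_sw are returned.
    """
    sst = sst_from_srca(sr_ca)
    d18o_thermal = ref_d18o + D18O_SST_SLOPE * (sst - ref_sst)
    return coral_d18o - d18o_thermal

# Example: a sample 1 degree warmer than the reference state with a light
# (rainy-season) coral d18O value yields a negative d18O_sw anomaly.
print(d18o_sw(coral_d18o=-4.9, sr_ca=SR_CA_INTCP + 26.0 * SR_CA_SLOPE))
```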
Abstract:
The physical conditions required to provide for the tectonic stability of cratonic crust and for the relative longevity of deep cratonic lithosphere within a dynamic, convecting mantle are explored through a suite of numerical simulations. The simulations allow chemically distinct continents to reside within the upper thermal boundary layer of a thermally convecting mantle layer. A rheologic formulation, which models both brittle and ductile behavior, is incorporated to allow for plate-like behavior and the associated subduction of oceanic lithosphere. Several mechanisms that may stabilize cratons are considered. The two most often invoked mechanisms, chemical buoyancy and/or high viscosity of cratonic root material, are found to be relatively ineffective if cratons come into contact with subduction zones. High root viscosity can provide for stability and longevity, but only within a thick-root limit in which the thickness of chemically distinct, high-viscosity cratonic lithosphere exceeds the thickness of old oceanic lithosphere by at least a factor of 2. This end-member implies a very thick mechanical lithosphere for cratons. A high brittle yield stress for cratonic lithosphere as a whole, relative to oceanic lithosphere, is found to be an effective and robust means of providing stability and lithospheric longevity. This mode does not require exceedingly deep strength within cratons. A high yield stress for only the crustal or mantle component of the cratonic lithosphere is found to be less effective, as detachment zones can then form at the crust-mantle interface, which decreases the longevity potential of cratonic roots. The degree of yield stress variation between cratonic and oceanic lithosphere required for stability and longevity can be decreased if cratons are bordered by continental lithosphere that has a relatively low yield stress, i.e., mobile belts. Simulations that combine all the mechanisms can lead to crustal stability and deep root longevity for model cratons over several mantle overturn times, but the dominant stabilizing factor remains a relatively high brittle yield stress for cratonic lithosphere.
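The brittle/ductile rheologic formulation referred to above can be sketched as a temperature-dependent ductile viscosity capped by a brittle yield stress, with a higher yield stress assigned to cratonic lithosphere. The functional form and all numerical values below are illustrative assumptions rather than the simulations' actual parameters.

```python
import numpy as np

def effective_viscosity(strain_rate_II, temperature, yield_stress,
                        eta0=1e21, activation=30.0):
    """Ductile viscosity capped by a brittle yield stress (illustrative values).

    strain_rate_II : second invariant of the strain-rate tensor (1/s)
    temperature    : non-dimensional temperature in [0, 1]
    yield_stress   : brittle yield stress (Pa); higher for cratonic lithosphere
    """
    eta_ductile = eta0 * np.exp(activation * (0.5 - temperature))  # stiffer when cold
    eta_yield = yield_stress / (2.0 * strain_rate_II)              # stress limiter
    return np.minimum(eta_ductile, eta_yield)

# Same deformation state, two lithosphere types: the higher brittle yield stress
# assigned to the craton gives it a higher effective viscosity, so it resists
# being entrained where ordinary oceanic lithosphere would yield.
rate, T = 1e-15, 0.1
print("oceanic :", effective_viscosity(rate, T, yield_stress=1e8))
print("cratonic:", effective_viscosity(rate, T, yield_stress=5e8))
```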
Abstract:
The paper presents a theory for modeling flow in anisotropic, viscous rock. This theory was originally developed for the simulation of large-deformation processes, including the folding and kinking of multi-layered visco-elastic rock (Muhlhaus et al. [1,2]). The orientation of slip planes in the context of crystallographic slip is determined by the normal vector - the director - of these surfaces. The model is applied to simulate anisotropic mantle convection. We compare the evolution of flow patterns, Nusselt number and director orientations for isotropic and anisotropic rheologies. In the simulations we utilize two different finite element methodologies: the Lagrangian Integration Point Method of Moresi et al. [8] and an Eulerian formulation, which we implemented in the finite-element-based PDE solver Fastflo (www.cmis.csiro.au/Fastflo/). The reason for utilizing two different finite element codes was, firstly, to study the influence of an anisotropic power-law rheology, which is not currently implemented in the Lagrangian Integration Point scheme [8], and, secondly, to compare the numerical performance of the Eulerian (Fastflo) and Lagrangian integration schemes [8]. It turned out that, whereas in the Lagrangian method the Nusselt number versus time plot reached only a quasi-steady state in which the Nusselt number oscillates around a steady-state value, the Eulerian scheme reaches exact steady states and produces a high degree of alignment (director orientation locally orthogonal to the velocity vector almost everywhere in the computational domain). In the simulations, emergent anisotropy was strongest, in terms of modulus contrast, in the up- and down-welling plumes. Mechanisms for anisotropic material behavior in the mantle dynamics context are discussed by Christensen [3]. The dominant mineral phases in the mantle generally do not exhibit strong elastic anisotropy, but they may still be oriented by the convective flow. Thus viscous anisotropy (the main focus of this paper) may or may not correlate with elastic or seismic anisotropy.
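The director referred to above is the unit normal to the slip planes, and it is carried and rotated by the flow. The sketch below advects a director using the standard evolution law for the normal of a material surface under a prescribed velocity gradient; this update rule is an illustrative assumption, not necessarily the exact scheme used in the cited codes.

```python
import numpy as np

def advect_director(n, L, dt):
    """Rotate the director (unit normal to the slip planes) with the flow.

    Uses the evolution law for the normal of a material surface,
        dn/dt = -L^T n + (n . L n) n,   with L_ij = dv_i/dx_j,
    followed by renormalisation to keep |n| = 1.
    """
    n_dot = -L.T @ n + (n @ L @ n) * n
    n_new = n + dt * n_dot
    return n_new / np.linalg.norm(n_new)

# Simple shear, v = (gamma_dot * y, 0): a director initially along +x rotates
# towards -y as shear accumulates, while a director along +y (horizontal slip
# planes) is unchanged by the same flow.
L = np.array([[0.0, 1.0],
              [0.0, 0.0]])          # gamma_dot = 1
n = np.array([1.0, 0.0])
for _ in range(100):                # total shear strain of 1
    n = advect_director(n, L, dt=0.01)
print(n)                            # approximately (0.71, -0.71)
```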
Abstract:
Simulations provide a powerful means to help gain the understanding of crustal fault system physics required to progress towards the goal of earthquake forecasting. Cellular Automata are efficient enough to probe system dynamics but their simplifications render interpretations questionable. In contrast, sophisticated elasto-dynamic models yield more convincing results but are too computationally demanding to explore phase space. To help bridge this gap, we develop a simple 2D elastodynamic model of parallel fault systems. The model is discretised onto a triangular lattice and faults are specified as split nodes along horizontal rows in the lattice. A simple numerical approach is presented for calculating the forces at medium and split nodes such that general nonlinear frictional constitutive relations can be modeled along faults. Single and multi-fault simulation examples are presented using a nonlinear frictional relation that is slip and slip-rate dependent in order to illustrate the model.
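As an illustration of the kind of nonlinear frictional constitutive relation applied at the split nodes, the sketch below combines simple slip-weakening and rate-weakening terms into a friction coefficient that bounds the shear traction on a fault. The functional form and parameter values are assumptions for illustration, not the specific relation used in the paper.

```python
import numpy as np

def friction_coefficient(slip, slip_rate,
                         mu_static=0.7, mu_dynamic=0.5,
                         d_c=0.01, v_c=0.1):
    """Illustrative slip- and slip-rate-dependent friction law (an assumption,
    not the paper's specific relation).

    Strength weakens linearly with slip over a critical distance d_c and
    weakens further as the slip rate grows relative to v_c.
    """
    slip_weakening = np.clip(slip / d_c, 0.0, 1.0)   # 0 -> 1 over d_c metres
    rate_weakening = slip_rate / (slip_rate + v_c)   # 0 -> 1 as rate grows
    weakening = np.maximum(slip_weakening, rate_weakening)
    return mu_static - (mu_static - mu_dynamic) * weakening

def frictional_traction(normal_stress, slip, slip_rate):
    """Shear traction bound enforced at the split nodes along a fault."""
    return friction_coefficient(slip, slip_rate) * normal_stress

print(frictional_traction(normal_stress=1e7, slip=0.0, slip_rate=0.0))   # static
print(frictional_traction(normal_stress=1e7, slip=0.02, slip_rate=1.0))  # weakened
```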
Abstract:
Almost all leprosy cases reported in industrialized countries occur amongst immigrants or refugees from developing countries, where leprosy continues to be an important health issue. Screening for leprosy is an important question for governments in countries with immigration and refugee programmes. A decision analysis framework is used to evaluate leprosy screening. The analysis uses a set of criteria and parameters regarding leprosy screening, and available data, to estimate the number of cases which would be detected by a leprosy screening programme for immigrants from countries with different leprosy prevalences, compared with a policy of waiting for immigrants who develop symptomatic clinical disease to present for health care. In a cohort of 100,000 immigrants from high leprosy prevalence regions (3.6/10,000), screening would detect 32 of the 42 cases which would arise in the destination country over the 14 years after migration; from medium prevalence areas (0.7/10,000), 6.3 of the total 8.1 cases would be detected, and from low prevalence regions (0.2/10,000), 1.8 of 2.3 cases. Using Australian data, the migrant mix would produce 74 leprosy cases from 10 years' intake; screening would detect 54, and 19 would be diagnosed subsequently after migration. Screening would only produce a significant case yield amongst immigrants from regions or social groups with high leprosy prevalence. Since the number of immigrants to Australia from countries of higher endemicity is not large, routine leprosy screening would have a small impact on case incidence.
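The case-yield arithmetic behind the decision analysis can be sketched as follows: for each source-region prevalence, the expected number of cases arising after migration is split into those found by screening and those presenting clinically later. The case totals per 100,000 immigrants echo the abstract, but collapsing the screening programme into a single detection fraction is a simplifying assumption rather than the paper's full model.

```python
def detected_at_screening(total_cases, detection_fraction):
    """Split expected cases into those found by screening and those that
    present clinically after migration (simplified single-parameter model)."""
    found = total_cases * detection_fraction
    return found, total_cases - found

detection_fraction = 32 / 42      # implied by the high-prevalence scenario
for region, total in [("high (3.6/10,000)", 42.0),
                      ("medium (0.7/10,000)", 8.1),
                      ("low (0.2/10,000)", 2.3)]:
    found, later = detected_at_screening(total, detection_fraction)
    print(f"{region:22s} detected {found:4.1f}, presenting later {later:4.1f}")
# The medium- and low-prevalence outputs land close to the abstract's 6.3 and
# 1.8 detected cases, which is consistent with a roughly constant detection
# fraction across scenarios.
```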
Abstract:
We model a buyer who wishes to combine objects owned by two separate sellers in order to realize higher value. Sellers are able to avoid entering into negotiations with the buyer, so that the order in which they negotiate is endogenous. Holdout occurs if at least one of the sellers is not present in the first round of negotiations. We demonstrate that complementarity of the buyer's technology is a necessary condition for equilibrium holdout. Moreover, a rise in complementarity leads to an increased likelihood of holdout, and an increased efficiency loss. Applications include patents, the land assembly problem, and mergers.
Abstract:
Certification of an ISO 14001 Environmental Management System (EMS) is currently an important requirement for enterprises wishing to sell their products in a global market. The system's structure is based on environmental impact evaluation (EIE). However, if an erroneous or inadequate methodology is applied, the entire process may be jeopardized. Many methodologies have been developed for carrying out EIEs; some of them are fairly complex and unsuitable for EMS implementation in an organizational context, principally when small and medium-sized enterprises (SMEs) are involved. The proposed methodology for EIE is part of a model for implementing EMS. The methodological approach used was a qualitative exploratory research method based upon sources of evidence such as document analyses, semi-structured interviews and participant observations. By adopting a cooperative implementation model based on the theory of systems engineering, difficulties relating to implementation of the sub-system were overcome, thus encouraging SMEs to implement EMS. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Reasons for performing study: Light microscopical studies show that the key lesion of laminitis is separation at the hoof lamellar dermal-epidermal interface. More precise knowledge of the damage occurring in the lamellar basement membrane zone may result if laminitis-affected tissue is examined with the transmission electron microscope. This could lead to a better understanding of the pathogenesis of lesions and the means of treatment or prevention. Objectives: To investigate the ultrastructure of acute laminitis as disease of increasing severity is induced by increasing oligofructose (OF) dosage. Methods: Three pairs of normal horses, dosed with OF at 7.5, 10 and 12.5 g/kg bwt via nasogastric intubation, developed laminitis 48 h later. Following euthanasia, their forefeet were processed for transmission electron microscopy. Lamellar basal cell hemidesmosome (HD) numbers and the distance between the basal cell plasmalemma and the lamina densa of the basement membrane were estimated and compared to control tissue. Results: Increasing OF dosage caused greater HD loss and more severe laminitis. The characteristic separation of the basement membrane, cytoskeleton failure and rounded basal cell nuclei result from combined HD dysassembly and anchoring filament failure. Conclusions: Without properly assembled HDs, dysadhesion between the lamina densa of the basement membrane (BM) and epidermal basal cells occurs, emphasising the fundamental importance of HDs in maintaining attachment at the lamellar interface. Medical conditions that trigger lamellar matrix metalloproteinase (MMP) activation and/or compromise entry of glucose into lamellar basal cells appear to promote loss and failure of HDs and, therefore, laminitis development. Potential relevance: A correlation between lameness severity and escalating loss of lamellar HDs now exists. Therapy aimed at protecting the lamellar environment from haematogenous delivery of MMP activators or from glucose deprivation may control laminitis development.