Abstract:
Evaluation of protein and metabolite expression patterns in blood using mass spectrometry and high-throughput antibody-based screening platforms has potential for the discovery of new biomarkers for managing breast cancer patient treatment. Previously identified blood-based breast cancer biomarkers, including cancer antigen 15-3 (CA15-3), are useful in combination with imaging (computed tomography scans, magnetic resonance imaging, X-rays) and physical examination for monitoring tumour burden in advanced breast cancer patients. However, these biomarkers suffer from insufficient accuracy, and with new therapies available for the treatment of breast cancer, there is an urgent need for reliable, non-invasive biomarkers that measure tumour burden with high sensitivity and specificity so as to provide early warning of the need to switch to an alternative treatment. The aim of this study was to identify a biomarker signature of tumour burden using cancer and non-cancer (healthy controls/non-malignant breast disease) patient samples. Results demonstrate that combinations of three candidate biomarkers drawn from glutamate, 12-hydroxyeicosatetraenoic acid, beta-hydroxybutyrate, Factor V and matrix metalloproteinase-1, together with CA15-3, an established biomarker for breast cancer, mirror tumour burden, with AUC values ranging from 0.71 to 0.98 when comparing non-malignant breast disease to the different stages of breast cancer. Further validation of these biomarker panels could potentially facilitate the management of breast cancer patients, especially in assessing changes in tumour burden in combination with imaging and physical examination.
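As a hedged illustration of how such a panel might be scored, the sketch below combines CA15-3 with candidate markers in a logistic regression model and evaluates the combination by cross-validated ROC AUC; the feature names and simulated data are placeholders, not the study's actual dataset or pipeline.

# Hypothetical sketch: score a CA15-3 + candidate-marker panel by cross-validated ROC AUC.
# Feature values and labels are simulated placeholders, not the study's data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
# Columns (assumed): CA15-3, glutamate, 12-HETE, beta-hydroxybutyrate
X = rng.normal(size=(n, 4))
y = rng.integers(0, 2, size=n)  # 1 = breast cancer, 0 = non-malignant breast disease

model = LogisticRegression(max_iter=1000)
prob = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
print(f"panel ROC AUC: {roc_auc_score(y, prob):.2f}")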
Abstract:
The shooting of a social worker by a client on the Gold Coast in 1991 graphically illustrated the issue of physical assaults and violence by service users against social workers. In this article we look at the incidence of physical assault, threats of violence, abuse of agency property and verbal abuse directed at social and other welfare workers by clients, using data from a survey in Melbourne. We then look at probable causes of menacing behaviour, such as the issues involved in work with involuntary clients, and we discuss options for preventing and coping with violence and abuse in the welfare workplace.
Abstract:
An assessment of the status of the Atlantic stock of red drum is conducted using recreational and commercial data from 1986 through 1998. This assessment updates data and analyses from the 1989, 1991, 1992 and 1995 stock assessments on Atlantic coast red drum (Vaughan and Helser, 1990; Vaughan 1992; 1993; 1996). Since 1981, coastwide recreational catches ranged between 762,300 pounds in 1980 and 2,623,900 pounds in 1984, while commercial landings ranged between 60,900 pounds in 1997 and 422,500 pounds in 1984. In weight of fish caught, Atlantic red drum constitute predominantly a recreational fishery (ranging between 85 and 95% during the 1990s). Commercially, red drum continue to be harvested as part of mixed-species fisheries. Using available length-frequency distributions and age-length keys, recreational and commercial catches are converted to catch in numbers at age. Separable and tuned virtual population analyses are conducted on the catch in numbers at age to obtain estimates of fishing mortality rates and population size (including recruitment to age 1). In turn, these estimates of fishing mortality rates, combined with estimates of growth (length and weight), sex ratios, sexual maturity and fecundity, are used to estimate yield per recruit, escapement to age 4, and static (or equilibrium) spawning potential ratio (static SPR, based on both female biomass and egg production). Three virtual population analysis (VPA) approaches (separable, spreadsheet, and FADAPT) were applied to catch matrices for two time periods (early: 1986-1991, and late: 1992-1998) and two regions (Northern: North Carolina and north, and Southern: South Carolina through the east coast of Florida). Additional catch matrices were developed based on different treatments of catch-and-release recreationally caught red drum (B2-type). These treatments included assuming 0% mortality (BASE0) versus 10% mortality for B2 fish. For the 10% mortality on B2 fish, sizes were assumed to be the same as those of caught fish (BASE1), the positive difference in size distribution between the early and later periods (DELTA), or intermediate between the two (PROP). Hence, a total of 8 catch matrices were developed (2 regions, and 4 B2 assumptions for 1986-1998) to which the three VPA approaches were applied. The question of when offshore emigration or reduced availability begins (during or after age 3) continues to be a source of bias that tends to result in overestimates of fishing mortality. Additionally, the continued assumption (Vaughan and Helser, 1990; Vaughan 1992; 1993; 1996) of no fishing mortality on adults (ages 6 and older) causes a bias that results in underestimates of fishing mortality for adult ages (0 versus some positive value). Because of emigration and the effect of the slot limit in the later period, a range of relative exploitations of age 3 to age 2 red drum was considered. Tuning indices were developed from the MRFSS and from state indices for use in the spreadsheet and FADAPT VPAs. The SAFMC Red Drum Assessment Group (Appendix A) favored the FADAPT approach with the catch matrix based on DELTA and a selectivity for age 3 relative to age 2 of 0.70 for the northern region and 0.87 for the southern region. In the northern region, estimates of static SPR increased from about 1.3% for the period 1987-1991 to approximately 18% (15% and 20%) for the period 1992-1998. For the southern region, estimates of static SPR increased from about 0.5% for the period 1988-1991 to approximately 15% for the period 1992-1998.
Population models used in this assessment (specifically yield per recruit and static spawning potential ratio) are based on equilibrium assumptions: because no direct estimates are available of the current status of the adult stock, model results imply potential longer-term, equilibrium effects. Because the current status of the adult stock is unknown, a specific rebuilding schedule cannot be determined. However, the duration of a rebuilding schedule should reflect, in part, a measure of the generation time of the fish species under consideration. For a long-lived, but relatively early spawning, species such as red drum, mean generation time would be on the order of 15 to 20 years based on age-specific egg production. Maximum age is 50 to 60 years for the northern region, and about 40 years for the southern region. The ASMFC Red Drum Board's first-phase recovery goal of increasing %SPR to at least 10% appears to have been met. (PDF contains 79 pages)
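For readers unfamiliar with the static SPR quantity reported above, the sketch below shows the standard per-recruit calculation: spawning output per recruit under the estimated fishing mortality divided by spawning output per recruit with no fishing. All age-specific inputs are illustrative placeholders, not the assessment's values.

# Hypothetical sketch of static (equilibrium) spawning potential ratio:
# SPR = spawning output per recruit under fishing / under no fishing.
# All age-specific values below are placeholders, not the assessment's inputs.
import numpy as np

ages     = np.arange(1, 11)
M        = 0.2                                                   # natural mortality (assumed)
F_at_age = np.array([0.1, 0.8, 0.8, 0.2, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])  # fishing mortality (assumed)
maturity = np.array([0.0, 0.0, 0.2, 0.8, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0])  # proportion mature (assumed)
eggs     = np.linspace(0.1, 1.0, len(ages))                      # relative fecundity at age (assumed)

def spawners_per_recruit(F):
    Z = M + F
    # Survivorship to the start of each age class, starting from one recruit at age 1.
    N = np.concatenate(([1.0], np.exp(-np.cumsum(Z[:-1]))))
    return np.sum(N * maturity * eggs)

spr = spawners_per_recruit(F_at_age) / spawners_per_recruit(np.zeros_like(F_at_age))
print(f"static SPR = {spr:.1%}")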
Abstract:
Arid and semiarid landscapes comprise nearly a third of the Earth's total land surface. These areas are coming under increasing land-use pressure. Despite their low productivity, these lands are not barren. Rather, they consist of fragile ecosystems vulnerable to anthropogenic disturbance.
The purpose of this thesis is threefold: (I) to develop and test a process model of wind-driven desertification, (II) to evaluate next-generation process-relevant remote monitoring strategies for use in arid and semiarid regions, and (III) to identify elements for effective management of the world's drylands.
In developing the process model of wind-driven desertification in arid and semiarid lands, field, remote sensing, and modeling observations from a degraded Mojave Desert shrubland are used. This model focuses on aeolian removal and transport of dust, sand, and litter as the primary mechanisms of degradation: killing plants by burial and abrasion, interrupting natural processes of nutrient accumulation, and allowing the loss of soil resources by abiotic transport. This model is tested in field sampling experiments at two sites and is extended by Fourier Transform and geostatistical analysis of high-resolution imagery from one site.
Next, the use of hyperspectral remote sensing data is evaluated as a substantive input to dryland remote monitoring strategies. In particular, the efficacy of spectral mixture analysis (SMA) in discriminating vegetation and soil types and determining vegetation cover is investigated. The results indicate that hyperspectral data may be less useful than often thought in determining vegetation parameters. Its usefulness in determining soil parameters, however, may be leveraged by developing simple multispectral classification tools that can be used to monitor desertification.
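As a hedged sketch of the linear SMA idea discussed above, the snippet below models a pixel spectrum as a non-negative mixture of endmember spectra and solves for the fractions by constrained least squares; the endmember and pixel reflectances are invented placeholders, not the thesis data.

# Hypothetical sketch of linear spectral mixture analysis (SMA): model a pixel
# spectrum as a non-negative combination of endmember spectra. All reflectance
# values are placeholders, not the thesis data.
import numpy as np
from scipy.optimize import nnls

# Rows are spectral bands; columns are endmembers: green vegetation, soil, shade.
endmembers = np.array([
    [0.05, 0.30, 0.02],
    [0.08, 0.35, 0.02],
    [0.45, 0.40, 0.03],
    [0.50, 0.45, 0.03],
])
pixel = np.array([0.20, 0.23, 0.41, 0.45])   # observed mixed spectrum

fractions, residual = nnls(endmembers, pixel)  # non-negative least squares unmixing
fractions /= fractions.sum()                   # normalize to sum-to-one cover fractions
print(dict(zip(["vegetation", "soil", "shade"], fractions.round(2))), residual)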
Finally, the elements required for effective monitoring and management of arid and semiarid lands are discussed. Several large-scale, multi-site field experiments are proposed to clarify the role of wind as a landscape and degradation process in drylands. The role of remote sensing in monitoring the world's drylands is discussed in terms of optimal remote sensing platform characteristics and the surface phenomena that may be monitored in order to identify areas at risk of desertification. A desertification indicator is proposed that unifies consideration of environmental and human variables.
Abstract:
We have investigated a resonant refractive nonlinearity in a semiconductor waveguide by measuring intensity-dependent phase shifts and bias-dependent recovery times. The measurements were performed on an optimized 750-μm-long AR-coated buried heterostructure MQW p-i-n waveguide with a bandedge at 1.48 μm. Figure 1 shows the experimental arrangement. The mode-locked color center laser was tuned to 50 meV beyond the bandedge, and 8 ps pulses with peak incident power up to 57 W were coupled into the waveguide. Some residual bandtail absorption remains at this wavelength; this is sufficient to photogenerate carriers, which give rise to a refractive nonlinearity, predominantly through plasma and bandfilling effects. A Fabry-Perot interferometer is used to measure the spectrum of the light which exits the waveguide. The nonlinearity within the guide causes self-phase modulation (SPM) of the light, and a study of the spectrum allows the magnitude and recovery time of the nonlinear phase shift to be recovered with a reasonable degree of accuracy. SPM spectra were recorded for a variety of pulse energies coupled into the unbiased waveguide. Figure 2 shows the resultant phase shift measured from the SPM spectra as a function of pulse energy. The relationship is linear, indicating that no saturation of the nonlinearity occurs for coupled pulse energies up to 230 pJ. A π phase shift, the minimum necessary for an all-optical switch, is obtained for a coupled pulse energy of 57 pJ, while the maximum phase shift, 4π, was measured for 230 pJ. The SPM spectra were highly asymmetric, with the pulse energy shifted to higher frequencies. Such spectra are characteristic of a slow, negative nonlinearity. This relatively slow speed is expected for the unbiased guide, as the recovery time will be of the order of the recombination time of the photogenerated electrons, about 1 ns for InGaAsP material. In order to reduce the recovery time of the nonlinearity, it is necessary to remove the photogenerated carriers from the waveguide by a process other than recombination. One such technique is to apply a reverse bias to the waveguide in order to sweep the carriers out. Figure 3 shows the effect on the recovery time of the nonlinearity of applying reverse bias to the waveguide for a coupled pulse energy of 230 pJ. The recovery time was reduced from one much longer than the length of the pulse, estimated to be about 1 ns, at zero bias to 18 ± 3 ps for a bias voltage greater than -4 V. This compares with a value of 24 ps obtained in a bulk waveguide.
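A quick arithmetic check, using only the figures quoted above, confirms that the reported values are consistent with a linear energy dependence: a π shift at 57 pJ extrapolates to roughly 4π at 230 pJ.

# Consistency check based only on the quoted data points: if the nonlinear phase
# shift scales linearly with coupled pulse energy and reaches pi at 57 pJ, the
# shift at 230 pJ should be about (230/57)*pi, i.e. roughly 4*pi as reported.
import math

phase_per_pJ = math.pi / 57      # rad per pJ, from the pi shift at 57 pJ
phi_230 = phase_per_pJ * 230     # predicted shift at 230 pJ
print(f"predicted shift at 230 pJ: {phi_230 / math.pi:.2f} pi")  # ~4.04 pi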
Abstract:
Leber hereditary optic neuropathy (LHON) was the first disease to be linked to the presence of a mitochondrial DNA (mtDNA) mutation. Nowadays over 95% of LHON cases are known to be caused by one of three primary mutations (m.11778G>A, m.14484T>C, and m.3460G>A).
Abstract:
We have developed a compiler for the lexically-scoped dialect of LISP known as SCHEME. The compiler knows relatively little about specific data manipulation primitives such as arithmetic operators, but concentrates on general issues of environment and control. Rather than having specialized knowledge about a large variety of control and environment constructs, the compiler handles only a small basis set which reflects the semantics of lambda-calculus. All of the traditional imperative constructs, such as sequencing, assignment, looping, GOTO, as well as many standard LISP constructs such as AND, OR, and COND, are expressed in macros in terms of the applicative basis set. A small number of optimization techniques, coupled with the treatment of function calls as GOTO statements, serve to produce code as good as that produced by more traditional compilers. The macro approach enables speedy implementation of new constructs as desired without sacrificing efficiency in the generated code. A fair amount of analysis is devoted to determining whether environments may be stack-allocated or must be heap-allocated. Heap-allocated environments are necessary in general because SCHEME (unlike Algol 60 and Algol 68, for example) allows procedures with free lexically scoped variables to be returned as the values of other procedures; the Algol stack-allocation environment strategy does not suffice. The methods used here indicate that a heap-allocating generalization of the "display" technique leads to an efficient implementation of such "upward funargs". Moreover, compile-time optimization and analysis can eliminate many "funargs" entirely, and so far fewer environment structures need be allocated at run time than might be expected. A subset of SCHEME (rather than triples, for example) serves as the representation intermediate between the optimized SCHEME code and the final output code; code is expressed in this subset in the so-called continuation-passing style. As a subset of SCHEME, it enjoys the same theoretical properties; one could even apply the same optimizer used on the input code to the intermediate code. However, the subset is so chosen that all temporary quantities are made manifest as variables, and no control stack is needed to evaluate it. As a result, this apparently applicative representation admits an imperative interpretation which permits easy transcription to final imperative machine code. These qualities suggest that an applicative language like SCHEME is a better candidate for an UNCOL than the more imperative candidates proposed to date.
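As a hedged illustration in Python of the continuation-passing style mentioned above (not the compiler's actual intermediate representation), the sketch below makes every intermediate value an explicit continuation argument and drives the steps with a trivial trampoline, so each call behaves like a jump rather than a stack push.

def factorial_cps(n, k):
    # Each step returns a thunk (zero-argument function) for the trampoline,
    # and every intermediate value is passed to the explicit continuation k.
    if n == 0:
        return lambda: k(1)
    return lambda: factorial_cps(n - 1, lambda acc: lambda: k(n * acc))

def trampoline(step):
    # Run steps until a non-callable (final) value appears; calls become jumps,
    # so no control stack builds up during the computation.
    while callable(step):
        step = step()
    return step

print(trampoline(lambda: factorial_cps(10, lambda result: result)))  # 3628800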
Abstract:
sermon text; MS Word document
Abstract:
snBench is a platform on which novice users compose and deploy distributed Sense and Respond programs for simultaneous execution on a shared, distributed infrastructure. It is a natural imperative that we have the ability to (1) verify the safety/correctness of newly submitted tasks and (2) derive the resource requirements for these tasks such that correct allocation may occur. To achieve these goals we have established a multi-dimensional sized type system for our functional-style Domain Specific Language (DSL) called Sensor Task Execution Plan (STEP). In such a type system data types are annotated with a vector of size attributes (e.g., upper and lower size bounds). Tracking multiple size aspects proves essential in a system in which Images are manipulated as a first class data type, as image manipulation functions may have specific minimum and/or maximum resolution restrictions on the input they can correctly process. Through static analysis of STEP instances we not only verify basic type safety and establish upper computational resource bounds (i.e., time and space), but we also derive and solve data and resource sizing constraints (e.g., Image resolution, camera capabilities) from the implicit constraints embedded in program instances. In fact, the static methods presented here have benefit beyond their application to Image data, and may be extended to other data types that require tracking multiple dimensions (e.g., image "quality", video frame-rate or aspect ratio, audio sampling rate). In this paper we present the syntax and semantics of our functional language, our type system that builds costs and resource/data constraints, and (through both formalism and specific details of our implementation) provide concrete examples of how the constraints and sizing information are used in practice.
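A hedged sketch of the multi-dimensional size idea, not the STEP implementation itself: an image type carries explicit resolution bounds, and composing two operations statically checks that a non-empty range of admissible resolutions remains. All names below (ImageType, the detector and camera bounds) are hypothetical.

# Hypothetical sketch (not the STEP type system): an Image type annotated with
# resolution bounds, plus a static check that composing two operations leaves a
# non-empty range of admissible resolutions.
from dataclasses import dataclass

@dataclass(frozen=True)
class ImageType:
    # Size annotations: lower/upper bounds on width and height (pixels).
    min_w: int
    max_w: int
    min_h: int
    max_h: int

def intersect(a: ImageType, b: ImageType) -> ImageType:
    # Tightest bounds satisfying both constraints; unsatisfiable => type error.
    t = ImageType(max(a.min_w, b.min_w), min(a.max_w, b.max_w),
                  max(a.min_h, b.min_h), min(a.max_h, b.max_h))
    if t.min_w > t.max_w or t.min_h > t.max_h:
        raise TypeError(f"unsatisfiable size constraints: {a} vs {b}")
    return t

# Hypothetical example: a face detector needs at least 320x240 input, while a
# camera delivers at most 640x480; their composition type-checks statically.
detector_in = ImageType(320, 10**6, 240, 10**6)
camera_out  = ImageType(1, 640, 1, 480)
print(intersect(detector_in, camera_out))  # ImageType(min_w=320, max_w=640, min_h=240, max_h=480)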
Abstract:
It is shown how the Debye rotational diffusion model of dielectric relaxation of polar molecules (which may be described in microscopic fashion as the diffusion limit of a discrete time random walk on the surface of the unit sphere) may be extended to yield the empirical Havriliak-Negami (HN) equation of anomalous dielectric relaxation from a microscopic model based on a kinetic equation just as in the Debye model. This kinetic equation is obtained by means of a generalization of the noninertial Fokker-Planck equation of conventional Brownian motion (generally known as the Smoluchowski equation) to fractional kinetics governed by the HN relaxation mechanism. For the simple case of noninteracting dipoles it may be solved by Fourier transform techniques to yield the Green function and the complex dielectric susceptibility corresponding to the HN anomalous relaxation mechanism.
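For reference, the Havriliak-Negami form referred to above, written here in standard notation (the parameters below are the usual HN broadening exponents and relaxation time, not quantities defined in this abstract):

% Havriliak-Negami complex susceptibility, standard form.
\[
  \frac{\varepsilon(\omega) - \varepsilon_{\infty}}
       {\varepsilon_{0} - \varepsilon_{\infty}}
  = \frac{1}{\bigl[\,1 + (i\omega\tau)^{\alpha}\bigr]^{\gamma}},
  \qquad 0 < \alpha \le 1,\quad 0 < \gamma \le 1,
\]
% which reduces to the Debye form for \alpha = \gamma = 1 and to the
% Cole-Cole form for \gamma = 1.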