Abstract:
We characterize asymmetric equilibria in two-stage process innovation games and show that they are prevalent in the different models of R&D technology considered in the literature. Indeed, cooperation in R&D may be accompanied by high concentration in the product market. We show that while such an increase in concentration may be profitable, it may be socially inefficient.
Abstract:
Complex glycoprotein biopharmaceuticals, such as follicle stimulating hormone (FSH), erythropoietin and tissue plasminogen activator, consist of a range of charge isoforms due to the extent of sialic acid capping of the glycoprotein glycans. Sialic acid occupies the terminal position on the oligosaccharide chain, masking the penultimate sugar residue, galactose, from recognition and uptake by the hepatocyte asialoglycoprotein receptor. It is therefore well established that the more acidic charge isoforms of glycoprotein biopharmaceuticals have higher in vivo potencies than those of less acidic isoforms due to their longer serum half-life. Current strategies for manipulating the glycoprotein charge isoform profile involve cell engineering or altering bioprocess parameters to optimise expression of more acidic or basic isoforms, rather than downstream separation of isoforms. A method for the purification of a discrete range of bioactive recombinant human FSH (rhFSH) charge isoforms based on Gradiflow(TM) preparative electrophoresis technology is described. Gradiflow(TM) electrophoresis is scalable, and its incorporation into glycoprotein biopharmaceutical production bioprocesses as a potential final step facilitates the production of biopharmaceutical preparations of improved in vivo potency. (C) 2005 Elsevier B.V. All rights reserved.
Abstract:
A simple framework was used to analyse the determinants of potential yield of sunflower (Helianthus annuus L.) in a subtropical environment. The aim was to investigate the stability of the determinants crop duration, canopy light interception, radiation use efficiency (RUE), and harvest index (HI) at 2 sowing times and with 3 genotypes differing in crop maturity and stature. Crop growth, phenology, light interception, yield, prevailing temperature, and radiation were recorded and measured throughout the crop cycle. Significant differences in grain yield were found between the 2 sowings, but not among genotypes within each sowing. Mean yields (0% moisture) were 6.02 and 2.17 t/ha for the first sowing, on 13 September (S1), and the second sowing, on 5 March (S2), respectively. Exceptionally high yields in S1 were due to high biomass assimilation associated with the high radiation environment, high light interception owing to a greater leaf area index, and high RUE (1.47-1.62 g/MJ) across genotypes. It is proposed that the high RUE was caused by high levels of available nitrogen maintained during crop growth by frequent applications of fertiliser and sewage effluent as irrigation. In addition to differences in the radiation environment, the assimilate partitioned to grain was reduced in S2, associated with a reduction in the duration of grain-filling. Harvest index was 0.40 in S1 and 0.25 in S2. It is hypothesised that low minimum temperatures experienced in S2 reduced assimilate production and partitioning, causing premature maturation.
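The framework above reduces to simple arithmetic: cumulative intercepted radiation multiplied by RUE gives biomass, and biomass multiplied by HI gives grain yield. The Python sketch below illustrates that calculation with placeholder inputs; it is not the study's analysis code, and all numbers are assumptions for illustration.

```python
# Illustrative sketch of the yield framework: biomass from cumulative
# intercepted radiation x RUE, grain yield from biomass x harvest index.
# All inputs are placeholder assumptions, not the study's measurements.

def potential_yield_t_ha(daily_radiation_mj_m2, daily_fraction_intercepted,
                         rue_g_per_mj, harvest_index):
    """Grain yield (t/ha) from daily incident radiation (MJ/m2), fractional
    light interception (0-1), radiation use efficiency (g/MJ) and HI."""
    biomass_g_m2 = sum(rad * f_int * rue_g_per_mj
                       for rad, f_int in zip(daily_radiation_mj_m2,
                                             daily_fraction_intercepted))
    grain_g_m2 = biomass_g_m2 * harvest_index
    return grain_g_m2 * 0.01  # 1 g/m2 = 0.01 t/ha

# Example: 100-day cycle at 20 MJ/m2/day, 80% interception, RUE 1.5 g/MJ, HI 0.40
print(potential_yield_t_ha([20.0] * 100, [0.8] * 100, 1.5, 0.40))  # about 9.6 t/ha
```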
Abstract:
The absence of considerations of technology in policy studies reinforces the popular notion that technology is a neutral tool. Through an analysis of the role played by computers in the policy processes of Australia's Department of Social Security, this paper argues that computers are political players in policy processes. Findings indicate that computers make aspects of the social domain knowable and therefore governable. The use of computers makes previously infeasible policies possible. Computers also operate as bureaucrats and as agents of client surveillance. Increased policy change, reduced discretion and increasingly targeted and complex policies can be attributed to the use of computer technology. If policy processes are to be adequately understood and analysed, then the role of technology in those processes must be considered.
Abstract:
Radiation dose calculations in nuclear medicine depend on quantification of activity via planar and/or tomographic imaging methods. However, both methods have inherent limitations, and the accuracy of activity estimates varies with object size, background levels, and other variables. The goal of this study was to evaluate the limitations of quantitative imaging with planar and single photon emission computed tomography (SPECT) approaches, with a focus on activity quantification for use in calculating absorbed dose estimates for normal organs and tumors. To do this we studied a series of phantoms of varying complexity of geometry, with three radionuclides whose decay schemes varied from simple to complex. Four aqueous concentrations of (99m)Tc, (131)I, and (111)In (74, 185, 370, and 740 kBq mL(-1)) were placed in spheres of four different sizes in a water-filled phantom, with three different levels of activity in the surrounding water. Planar and SPECT images of the phantoms were obtained on a modern SPECT/computed tomography (CT) system. These radionuclide and concentration/background studies were repeated using a cardiac phantom and a modified torso phantom with liver and "tumor" regions containing the radionuclide concentrations and with the same varying background levels. Planar quantification was performed using the geometric mean approach, with attenuation correction (AC), and with and without scatter corrections (SC and NSC). SPECT images were reconstructed using attenuation maps (AM) for AC; scatter windows were used to perform SC during image reconstruction. For spherical sources with corrected data, good accuracy was observed (generally within +/- 10% of known values) for the largest sphere (11.5 mL) with both planar and SPECT methods for (99m)Tc and (131)I, but accuracy was poorest and deviated from known values for smaller objects, most notably for (111)In. SPECT quantification was affected by the partial volume effect in smaller objects and generally showed larger errors than the planar results in these cases for all radionuclides. For the cardiac phantom, results were the most accurate of all of the experiments for all radionuclides. Background subtraction was an important factor influencing these results. The contribution of scattered photons was important in quantification with (131)I; if scatter was not accounted for, activity tended to be overestimated using planar quantification methods. For the torso phantom experiments, results show a clear underestimation of activity when compared to the previous experiments with spherical sources for all radionuclides. Despite some variations that were observed as the level of background increased, the SPECT results were more consistent across different activity concentrations. Planar or SPECT quantification on state-of-the-art gamma cameras with appropriate quantitative processing can provide accuracies of better than 10% for large objects and modest target-to-background concentrations; however, when smaller objects are used, in the presence of higher background, and for nuclides with more complex decay schemes, SPECT quantification methods generally produce better results. Health Phys. 99(5):688-701; 2010
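As a point of reference for the planar method mentioned above, the conjugate-view (geometric mean) approach combines anterior and posterior counts and applies an attenuation correction based on an effective attenuation coefficient and body thickness. The Python sketch below shows that textbook formula with illustrative parameters; it does not reproduce the study's full processing chain (scatter windows, background subtraction, camera calibration).

```python
import math

# Textbook conjugate-view (geometric mean) planar quantification with
# attenuation correction. Parameter values below are illustrative assumptions.

def conjugate_view_activity_mbq(counts_anterior, counts_posterior, acq_time_s,
                                mu_eff_per_cm, thickness_cm, calib_cps_per_mbq):
    """Estimate source activity (MBq) from opposed anterior/posterior views."""
    rate_ant = counts_anterior / acq_time_s     # anterior count rate (cps)
    rate_post = counts_posterior / acq_time_s   # posterior count rate (cps)
    geometric_mean = math.sqrt(rate_ant * rate_post)
    attenuation_correction = math.exp(mu_eff_per_cm * thickness_cm / 2.0)
    return geometric_mean * attenuation_correction / calib_cps_per_mbq

# Example: 120 s views, effective mu ~0.12 /cm (99mTc in water), 20 cm body
# thickness, camera sensitivity 90 cps/MBq -- all illustrative numbers.
print(conjugate_view_activity_mbq(150000, 110000, 120.0, 0.12, 20.0, 90.0))
```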
Abstract:
Gamma- and beta-emitting radiopharmaceuticals are handled in nuclear medicine services, and in many cases individual monitoring covers only gamma radiation. In this paper, the results obtained using a wrist dosimeter prototype (CaSO(4):Dy + Teflon pellets) show that the doses to workers occupationally exposed to beta radiation from (153)Sm are not negligible. It is important that this dose be evaluated and taken into consideration in the individual monitoring system.
Abstract:
This study examined the impact of computer and assistive device use on the employment status and vocational modes of people with physical disabilities in Australia. A survey was distributed to people over 15 years of age with physical disabilities living in the Brisbane area. Responses were received from 82 people, including those with spinal cord injuries, cerebral palsy and muscular dystrophy. Of the respondents, 46 were employed, 22 were unemployed, and 12 were either students or undertaking voluntary work. Three-quarters of respondents used a computer in their occupations, while 15 used assistive devices. Using logistic regression analysis, it was found that gender, education, level of computer skill and computer training were significant predictors of employment outcomes. Neither the age of respondents nor the use of assistive software was a significant predictor. From the information obtained in this study, guidelines for a training programme designed to maximize the employability of people with physical disabilities were developed.
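For readers unfamiliar with the analysis mentioned, the sketch below shows the general shape of such a logistic regression on synthetic stand-in data (the survey data are not available here); the predictor coding and resulting coefficients are illustrative only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative re-creation of the analysis pattern only: binary employment
# status regressed on gender, education, computer skill and computer training.
# The data below are synthetic; they are not the survey responses.
rng = np.random.default_rng(0)
n = 82  # same sample size as the survey, but synthetic values
X = np.column_stack([
    rng.integers(0, 2, n),   # gender (0/1)
    rng.integers(0, 4, n),   # education level (ordinal)
    rng.integers(0, 5, n),   # self-rated computer skill
    rng.integers(0, 2, n),   # received computer training (0/1)
])
# Outcome loosely tied to skill and training, purely for illustration
y = (X[:, 2] + 2 * X[:, 3] + rng.normal(0, 1, n) > 3).astype(int)

model = LogisticRegression().fit(X, y)
print(dict(zip(["gender", "education", "skill", "training"], model.coef_[0])))
```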
Abstract:
Recent advances in computer technology have made it possible to create virtual plants by simulating the details of structural development of individual plants. Software has been developed that processes plant models expressed in a special-purpose mini-language based on the Lindenmayer system formalism. These models can be extended from their architectural basis to capture plant physiology by integrating them with crop models, which estimate biomass production as a consequence of environmental inputs. Through this process, virtual plants will gain the ability to react to broad environmental conditions, while crop models will gain a visualisation component. This integration requires the resolution of the fundamentally different time scales underlying the two approaches. Architectural models are usually based on physiological time; each time step encompasses the same amount of development in the plant, without regard to the passage of real time. In contrast, physiological models are based in real time; the amount of development in a time step depends on environmental conditions during the period. This paper provides background on the plant modelling language and then describes how widely used concepts of thermal time can be implemented to resolve these time scale differences. The process is illustrated using a case study. (C) 1997 Elsevier Science Ltd.
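A minimal sketch of the thermal-time coupling described above: daily temperatures are converted to degree-days, and an architectural development step is triggered whenever a fixed physiological-time increment has accumulated. The base temperature, step size, and function names are illustrative assumptions, not values or code from the paper.

```python
# Sketch of thermal-time coupling: daily weather -> degree-days -> discrete
# architectural development steps. Parameter values are illustrative.

def daily_thermal_time(t_max, t_min, t_base=8.0):
    """Degree-days for one day using the simple mean-temperature method."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def development_steps(daily_temps, step_dd=45.0):
    """Count architectural development steps driven by accumulated thermal time."""
    accumulated, steps = 0.0, 0
    for t_max, t_min in daily_temps:
        accumulated += daily_thermal_time(t_max, t_min)
        while accumulated >= step_dd:   # one physiological-time step completed
            accumulated -= step_dd
            steps += 1                  # e.g. initiate the next phytomer
    return steps

# Warm days advance the virtual plant further than cool days over the same real time.
print(development_steps([(30, 18)] * 30), development_steps([(20, 10)] * 30))
```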
Abstract:
This paper reports on measurements of crack growth by environmentally assisted fracture (EAF) for 4340 steel in water and in air at various relative humidities. Of most interest is the observation of slow crack propagation in dry air. Fractographic analysis strongly suggests that this slow crack propagation is due to hydrogen cracking caused by internal hydrogen in solid solution inside the sample material.
Abstract:
Background: The development of products and services for health care systems is one of the most important phenomena to have occurred in the field of health care over the last 50 years. It generates significant commercial, medical and social results. Although much has been done to understand how health technologies are adopted and regulated in developed countries, little attention has been paid to the situation in low- and middle-income countries (LMICs). Here we examine the institutional environment in which decisions are made regarding the adoption of expensive medical devices into the Brazilian health care system. Methods: We used a case study strategy to address our research question. The empirical work relied on in-depth interviews (N = 16) with representatives of a wide range of actors and stakeholders that participate in the process of diffusion of CT (computerized tomography) scanners in Brazil, including manufacturers, health care organizations, medical specialty societies, health insurance companies, regulatory agencies and the Ministry of Health. Results: The adoption of CT scanners is not determined by health policy makers or third-party payers in the public and private sectors. Instead, decisions are primarily made by administrators of individual hospitals and clinics, strongly influenced by both physicians and sales representatives of the medical industry who act as change agents. Because this process is not properly regulated by public authorities, health care organizations are free to decide whether, when and how they will adopt a particular technology. Conclusions: Our study identifies problems in how health care systems in LMICs adopt new, expensive medical technologies, and suggests that a set of innovative approaches and policy instruments is needed in order to balance the institutional and professional desire to practise modern and expensive medicine in a context of health inequalities and basic health needs.
Concepts and determination of reference values for human biomonitoring of environmental contaminants
Abstract:
Human biomonitoring (HBM) of environmental contaminants plays an important role in estimating exposure and evaluating risk, and thus it has been increasingly applied in the environmental field. The results of HBM must be compared with reference values (RV). The term "reference values" has always been related to the interpretation of clinical laboratory tests. For physicians, RV indicate "normal values" or "limits of normal"; in turn, toxicologists prefer the terms "background values" or "baseline values" to refer to the presence of contaminants in biological fluids. This discrepancy leads to the discussion of which population should be selected to determine RV. Whereas clinical chemistry employs an altered health state as the main exclusion criterion to select a reference population (that is, a "healthy" population would be selected), in environmental toxicology the exclusion criterion is abnormal exposure to xenobiotics. Therefore, the choice of population to determine RV is based on the very purpose of the RV to be determined. The present paper discusses the concepts and methodology used to determine RV for biomarkers of chemical environmental contaminants.
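As one common way such RV are operationalised (an assumption about general practice, not a claim about this paper's method), an upper percentile of the biomarker distribution in a suitably screened reference population can be taken as the reference value, as in the sketch below with synthetic data.

```python
import numpy as np

# Percentile-based reference value on synthetic data: an upper percentile of a
# biomarker's distribution in a reference population screened to exclude
# abnormal exposure. One common convention only; not this paper's prescription.
rng = np.random.default_rng(1)
concentrations_ug_l = rng.lognormal(mean=2.3, sigma=0.5, size=500)  # synthetic biomarker levels

rv95 = np.percentile(concentrations_ug_l, 95)
print(f"Reference value (95th percentile): {rv95:.1f} ug/L")
```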
Abstract:
Little is known about the effect of clinical characteristics, parental psychopathology, family functioning, and environmental stressors on the response to methylphenidate in children with attention-deficit/hyperactivity disorder (ADHD) followed up in a naturalistic setting. Data from cultures outside the United States are extremely scarce. This is a longitudinal study using a nonrandom-assignment, quasi-experimental design. One hundred twenty-five children with ADHD were treated with methylphenidate according to standard clinical procedures and followed up for 6 months. The severity of ADHD symptoms was assessed with the Swanson, Nolan, and Pelham rating scale. In the final multivariate model, ADHD combined subtype (P < 0.001) and comorbidity with oppositional defiant disorder (P = 0.03) were both predictors of a worse clinical response. In addition, the level of maternal ADHD symptoms was also associated with worse prognosis (P < 0.001). Of the several adverse psychosocial factors assessed, only undesired pregnancy was associated with poorer response to methylphenidate in the final comprehensive model (P = 0.02). Our study provides evidence for the involvement of clinical characteristics, maternal psychopathology, and environmental stressors in the response to methylphenidate. Clinicians may consider adjuvant strategies when negative predictors are present to increase the chances of success with methylphenidate treatment.