Abstract:
Neuronal oscillations are thought to underlie interactions between distinct brain regions required for normal memory functioning. This study aimed at elucidating the neuronal basis of memory abnormalities in neurodegenerative disorders. Magnetoencephalography (MEG) was used to measure oscillatory brain signals in patients with Alzheimer's disease (AD), a neurodegenerative disease causing progressive cognitive decline, and mild cognitive impairment (MCI), a disorder characterized by mild but clinically significant complaints of memory loss without apparent impairment in other cognitive domains. Furthermore, to help interpret our AD/MCI results and to develop more powerful oscillatory MEG paradigms for clinical memory studies, oscillatory neuronal activity underlying declarative memory, the function afflicted first in both AD and MCI, was investigated in a group of healthy subjects. An increased temporal-lobe contribution coinciding with parieto-occipital deficits in oscillatory activity was observed in AD patients: sources in the 6–12.5 Hz range were significantly stronger in the parieto-occipital and significantly weaker in the right temporal region in AD patients, as compared to MCI patients and healthy elderly subjects. Further, the auditory steady-state response, thought to represent both evoked and induced activity, was enhanced in AD patients, as compared to controls, possibly reflecting decreased inhibition in auditory processing and deficits in adaptation to repetitive stimulation with low relevance. Finally, the methodological study revealed that successful declarative encoding and retrieval is associated with increases in occipital gamma and right-hemisphere theta power in healthy unmedicated subjects. This result suggests that neuronal oscillations recorded during cognitive performance could potentially be used to probe declarative memory deficits in AD patients. Taken together, the present results provide insight into the role of brain oscillatory activity in memory function and memory disorders.
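As a rough illustration of the band-limited power analysis summarized above, the sketch below estimates power in a 6–12.5 Hz band from a single sensor trace using Welch's method. The sampling rate, the synthetic signal, and the use of a single channel are illustrative assumptions, not the study's actual acquisition or source-analysis pipeline.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi):
    """Sum the Welch PSD between f_lo and f_hi (assumed band edges)."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

# Illustrative synthetic "sensor" trace: a 10 Hz oscillation plus noise.
fs = 600                              # Hz, an assumed sampling rate
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

print(band_power(x, fs, 6.0, 12.5))   # power in the 6-12.5 Hz range
```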
Abstract:
The results from laboratory model tests and numerical simulations on square footings resting on sand are presented. The bearing capacity of footings on geosynthetic-reinforced sand is evaluated, and the effect of various reinforcement parameters, such as the type and tensile strength of the geosynthetic material, the amount of reinforcement, and the layout and configuration of geosynthetic layers below the footing, on the bearing capacity improvement of the footings is studied through systematic model studies. A steel tank of size 900 × 900 × 600 mm is used for conducting the model tests. Four types of grids, namely a strong biaxial geogrid, a weak biaxial geogrid, a uniaxial geogrid and a geonet, each with a different tensile strength, are used in the tests. Geosynthetic reinforcement is provided in the form of planar layers, varying the depth of the reinforced zone below the footing, the number of geosynthetic layers within the reinforced zone and the width of the geosynthetic layers in different tests. The influence of all these parameters on the bearing capacity improvement of the square footing and its settlement is studied by comparison with tests on unreinforced sand. Results show that the effective depth of reinforcement is twice the width of the footing and the optimum spacing of geosynthetic layers is half the width of the footing. It is observed that the layout and configuration of the reinforcement play a greater role in bearing capacity improvement than the tensile strength of the geosynthetic material. Experimental observations are supported by the findings from numerical analyses.
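For readers unfamiliar with how "bearing capacity improvement" is usually quantified in such model studies, the sketch below computes a bearing capacity ratio (BCR): the ultimate bearing pressure of the reinforced bed divided by that of the unreinforced bed. The pressures and layout labels are placeholders, not results from these tests.

```python
def bearing_capacity_ratio(q_reinforced, q_unreinforced):
    """BCR = ultimate bearing pressure with reinforcement / without it."""
    return q_reinforced / q_unreinforced

# Placeholder pressures (kPa) for an unreinforced bed and two hypothetical layouts.
q_unreinforced = 150.0
for label, q_r in [("1 geogrid layer", 210.0), ("3 geogrid layers", 320.0)]:
    print(label, round(bearing_capacity_ratio(q_r, q_unreinforced), 2))
```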
Abstract:
γ-aminobutyric acid (GABA) is the main inhibitory transmitter in the nervous system and acts via three distinct receptor classes: A, B, and C. GABAC receptors are ionotropic receptors comprising ρ subunits. In this work, we aimed to elucidate the expression of ρ subunits in the postnatal brain, the characteristics of ρ2 homo-oligomeric receptors, and the function of GABAC receptors in the hippocampus. In situ hybridization on rat brain slices showed ρ2 mRNA expression from birth in the superficial grey layer of the superior colliculus, from the first postnatal week in the hippocampal CA1 region and the pretectal nucleus of the optic tract, and in the adult dorsal lateral geniculate nucleus. Quantitative RT-PCR revealed expression of all three ρ subunits in the hippocampus and superior colliculus from the first postnatal day. In the hippocampus, ρ2 mRNA expression clearly dominated over ρ1 and ρ3. GABAC receptor protein expression was confirmed in the adult hippocampus, superior colliculus, and dorsal lateral geniculate nucleus by immunohistochemistry. From the selective distribution of ρ subunits, GABAC receptors may be hypothesized to be specifically involved in aspects of visual image motion processing in the rat brain. Although previous data had indicated a much higher expression level for ρ2 subunit transcripts than for ρ1 or ρ3 in the brain, earlier work on Xenopus oocytes had suggested that rat ρ2 subunits do not form functional homo-oligomeric GABAC receptors but need ρ1 or ρ3 subunits to form hetero-oligomers. Our results demonstrated, for the first time, that HEK 293 cells transfected with ρ2 cDNA displayed currents in whole-cell patch-clamp recordings. Homomeric rat ρ2 receptors had a decreased sensitivity to, but a high affinity for, picrotoxin and a marked sensitivity to the GABAC receptor agonist CACA. Our results suggest that ρ2 subunits may contribute to brain function, even in areas not expressing other ρ subunits. Using extracellular electrophysiological recordings, we aimed to study the effects of GABAC receptor agonists and antagonists on the responses of hippocampal neurons to electrical stimulation. Activation of GABAC receptors with CACA suppressed postsynaptic excitability, and the GABAC receptor antagonist TPMPA inhibited the effects of CACA. Next, we aimed to demonstrate the activation of GABAC receptors by synaptically released GABA using intracellular recordings. GABA-mediated long-lasting depolarizing responses evoked by high-frequency stimulation were prolonged by TPMPA. For weaker stimulation, the effect of TPMPA was enhanced after GABA uptake was inhibited. Our data demonstrate that GABAC receptors can be activated by endogenous synaptic transmitter release following strong stimulation or under conditions of reduced GABA uptake. The lack of GABAC receptor activation by less intensive stimulation under control conditions suggests that these receptors are extrasynaptic and activated via spillover of synaptically released GABA. Taken together with the restricted expression pattern of GABAC receptors in the brain and their distinctive pharmacological and biophysical properties, our findings supporting extrasynaptic localization of these receptors raise interesting possibilities for novel pharmacological therapies in the treatment of, for example, epilepsy and sleep disorders.
Abstract:
The StreamIt programming model has been proposed to exploit parallelism in streaming applications on general-purpose multicore architectures. The StreamIt graphs describe task, data and pipeline parallelism which can be exploited on accelerators such as Graphics Processing Units (GPUs) or the CellBE which support abundant parallelism in hardware. In this paper, we describe a novel method to orchestrate the execution of a StreamIt program on a multicore platform equipped with an accelerator. The proposed approach identifies, using profiling, the relative benefits of executing a task on the superscalar CPU cores and the accelerator. We formulate the problem of partitioning the work between the CPU cores and the GPU, taking into account the latencies for data transfers and the required buffer layout transformations associated with the partitioning, as an integrated Integer Linear Program (ILP) which can then be solved by an ILP solver. We also propose an efficient heuristic algorithm for the work partitioning between the CPU and the GPU, which provides solutions within 9.05% of the optimal solution on average across the benchmark suite. The partitioned tasks are then software-pipelined to execute on the multiple CPU cores and the Streaming Multiprocessors (SMs) of the GPU. The software pipelining algorithm orchestrates the execution between the CPU cores and the GPU by emitting the code for the CPU and the GPU, and the code for the required data transfers. Our experiments on a platform with 8 CPU cores and a GeForce 8800 GTS 512 GPU show a geometric mean speedup of 6.94X with a maximum of 51.96X over single-threaded CPU execution across the StreamIt benchmarks. This is an 18.9% improvement over a partitioning strategy that maps only the filters that cannot be executed on the GPU - the filters with state that is persistent across firings - onto the CPU.
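The partitioning idea can be made concrete with a toy sketch: enumerate CPU/GPU assignments of a handful of filters, pin stateful filters to the CPU, charge a transfer cost for every edge that crosses the partition, and keep the assignment with the smallest steady-state bottleneck. This is only a conceptual stand-in for the paper's ILP formulation and heuristic; the filter names, timings and transfer cost are made-up placeholders.

```python
from itertools import product

# Per-filter execution times (ms per firing) from profiling (placeholder values).
filters = ["src", "fir", "decode", "sink"]
cpu_time = {"src": 1.0, "fir": 8.0, "decode": 6.0, "sink": 1.5}
gpu_time = {"src": 0.8, "fir": 0.9, "decode": 1.1, "sink": 1.4}
stateful = {"src": False, "fir": False, "decode": False, "sink": True}  # must stay on CPU
edges = [("src", "fir"), ("fir", "decode"), ("decode", "sink")]
transfer = 2.0  # ms per crossing edge (placeholder DMA cost)

best = None
for assign in product(["cpu", "gpu"], repeat=len(filters)):
    placement = dict(zip(filters, assign))
    if any(stateful[f] and placement[f] == "gpu" for f in filters):
        continue  # stateful filters are pinned to the CPU
    cpu_load = sum(cpu_time[f] for f in filters if placement[f] == "cpu")
    gpu_load = sum(gpu_time[f] for f in filters if placement[f] == "gpu")
    dma = sum(transfer for a, b in edges if placement[a] != placement[b])
    # Toy model: steady-state throughput is limited by the slowest of CPU, GPU, transfer link.
    bottleneck = max(cpu_load, gpu_load, dma)
    if best is None or bottleneck < best[0]:
        best = (bottleneck, placement)

print(best)
```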
Abstract:
This thesis utilises an evidence-based approach to critically evaluate and summarize effectiveness research on physiotherapy, physiotherapy-related motor-based interventions and orthotic devices in children and adolescents with cerebral palsy (CP). It aims to assess the methodological challenges of the systematic reviews and trials, to evaluate the effectiveness of interventions in current use, and to make suggestions for future trials. Methods: Systematic reviews were searched from computerized bibliographic databases up to August 2007 for physiotherapy and physiotherapy-related interventions, and up to May 2003 for orthotic devices. Two reviewers independently identified, selected, and assessed the quality of the reviews using the Overview Quality Assessment Questionnaire complemented with decision rules. From a sample of 14 randomized controlled trials (RCTs) published between January 1990 and June 2003 we analysed the methods of sampling, recruitment, and comparability of groups; defined the components of a complex intervention; identified outcome measures based on the International Classification of Functioning, Disability and Health (ICF); analysed the clinical interpretation of score changes; and analysed trial reporting using a modified 33-item CONSORT (Consolidated Standards of Reporting Trials) checklist. The effectiveness of physiotherapy and physiotherapy-related interventions in children with diagnosed CP was evaluated in a systematic review of randomised controlled trials that were searched from computerized databases from January 1990 up to February 2007. Two reviewers independently assessed the methodological quality, extracted the data, classified the outcomes using the ICF, and considered the level of evidence according to van Tulder et al. (2003). Results: We identified 21 reviews on physiotherapy and physiotherapy-related interventions and five on orthotic devices. These reviews summarized 23 and 5 randomised controlled trials and 104 and 27 observational studies, respectively. Only six reviews were of high quality. These found some evidence supporting strength training, constraint-induced movement therapy or hippotherapy, and insufficient evidence on comprehensive interventions. Based on the original studies included in the reviews on orthotic devices we found some short-term effects of lower limb casting on passive range of movement, and of ankle-foot orthoses on equinus walk. Long-term effects of lower limb orthoses have not been studied. Evidence on upper limb casting or orthoses is conflicting. In the sample of 14 RCTs, most trials used simple randomisation, complemented with matching or stratification, but only three specified the concealed allocation. Numerous studies provided sufficient details on the components of a complex intervention, but the overlap of outcome measures across studies was poor and the clinical interpretation of observed score changes was mostly missing. Almost half (48%) of the applicable CONSORT-based items (range 28-32) were reported adequately. Most reporting inadequacies were in outcome measures, sample size determination, details of the sequence generation, allocation concealment and implementation of the randomization, success of assessor blinding, recruitment and follow-up dates, intention-to-treat analysis, precision of the effect size, co-interventions, and adverse events. The systematic review identified 22 trials on eight intervention categories. Four trials were of high quality.
Moderate evidence of effectiveness was established for upper extremity treatments on attained goals, active supination and developmental status, and for constraint-induced therapy on the amount and quality of hand use and new emerging behaviours. Moderate evidence of ineffectiveness was found for strength training's effect on walking speed and stride length. Conflicting evidence was found for strength training's effect on gross motor function. For the other intervention categories the evidence was limited due to the low methodological quality and the statistically non-significant results of the studies. Conclusions: The high-quality reviews provide both supportive and insufficient evidence on some physiotherapy interventions. The poor quality of most reviews calls for caution, although most reviews drew no conclusions on effectiveness due to the poor quality of the primary studies. A considerable number of RCTs of good to fair methodological and reporting quality indicate that informative and well-reported RCTs on complex interventions in children and adolescents with CP are feasible. Nevertheless, methodological improvement is needed in certain areas of trial design and performance, and trial authors are encouraged to follow the CONSORT criteria. Based on RCTs we established moderate evidence for some effectiveness of upper extremity training. Due to limitations in methodological quality and variations in populations, interventions and outcomes, only limited evidence on the effectiveness of most physiotherapy interventions is available to guide clinical practice. Well-designed trials are needed, especially for focused physiotherapy interventions.
Abstract:
Aims: The aims of this study were 1) to identify and describe health economic studies that have used quality-adjusted life years (QALYs) based on actual measurements of patients' health-related quality of life (HRQoL); 2) to test the feasibility of routine collection of HRQoL data as an indicator of effectiveness of secondary health care; and 3) to establish and compare the cost-utility of three large-volume surgical procedures in a real-world setting in the Helsinki University Central Hospital, a large referral hospital providing secondary and tertiary health-care services for a population of approximately 1.4 million. Patients and methods: To identify studies that have used QALYs as an outcome measure, a systematic search of the literature was performed using the Medline, Embase, CINAHL, SCI and Cochrane Library electronic databases. Initial screening of the identified articles involved two reviewers independently reading the abstracts; the full-text articles were also evaluated independently by two reviewers, with a third reviewer used in cases where the two reviewers could not reach a consensus on which articles should be included. The feasibility of routinely evaluating the cost-effectiveness of secondary health care was tested by setting up a system for collecting HRQoL data on approximately 4 900 patients before and after operative treatments performed in the hospital. The HRQoL data used as an indicator of treatment effectiveness were combined with diagnostic and financial indicators routinely collected in the hospital. To compare the cost-effectiveness of three surgical interventions, 712 patients admitted for routine operative treatment completed the 15D HRQoL questionnaire before and 3-12 months after the operation. QALYs were calculated using the obtained utility data and the expected remaining life years of the patients. Direct hospital costs were obtained from the clinical patient administration database of the hospital and a cost-utility analysis was performed from the perspective of the provider of secondary health care services. Main results: The systematic review (Study I) showed that although QALYs gained are considered an important measure of the effectiveness of health care, the number of studies in which QALYs are based on actual measurements of patients' HRQoL is still fairly limited. Of the reviewed full-text articles, only 70 reported QALYs based on actual before-after measurements using a valid HRQoL instrument. Collection of simple cost-effectiveness data in secondary health care is feasible and could easily be expanded and performed on a routine basis (Study II). It allows meaningful comparisons between various treatments and provides a means for allocating limited health care resources. The cost per QALY gained was 2 770 for cervical operations and 1 740 for lumbar operations. In cases where surgery was delayed, the cost per QALY was doubled (Study III). The cost per QALY varied between subgroups in cataract surgery (Study IV). The cost per QALY gained was 5 130 for patients having both eyes operated on and 8 210 for patients with only one eye operated on during the 6-month follow-up. In patients whose first eye had been operated on prior to the study period, the mean HRQoL deteriorated after surgery, thus precluding the establishment of the cost per QALY.
In arthroplasty patients (Study V), the mean cost per QALY gained in a one-year period was 6 710 for primary hip replacement, 52 270 for revision hip replacement, and 14 000 for primary knee replacement. Conclusions: Although the importance of cost-utility analyses has been stressed during recent years, there are only a limited number of studies in which the evaluation is based on patients' own assessment of the treatment effectiveness. Most of the cost-effectiveness and cost-utility analyses are based on modeling that employs expert opinion regarding the outcome of treatment, not on patient-derived assessments. Routine collection of effectiveness information from patients entering treatment in secondary health care proved straightforward and did not, for instance, require additional personnel on the wards in which the study was executed. The mean patient response rate was more than 70%, suggesting that patients were happy to participate and appreciated the fact that the hospital showed an interest in their well-being even after the actual treatment episode had ended. Spinal surgery leads to a statistically significant and clinically important improvement in HRQoL. The cost per QALY gained was reasonable, at less than half of that observed, for instance, for hip replacement surgery. However, prolonged waiting for an operation approximately doubled the cost per QALY gained from the surgical intervention. The mean utility gain following routine cataract surgery in a real-world setting was relatively small and confined mostly to patients who had had both eyes operated on. The cost of cataract surgery per QALY gained was higher than previously reported and was associated with a considerable degree of uncertainty. Hip and knee replacement both improve HRQoL. The cost per QALY gained from knee replacement is twice that of hip replacement. Cost-utility results from the three studied specialties showed that there is great variation in the cost-utility of surgical interventions performed in a real-world setting even when only common, widely accepted interventions are considered. However, the cost per QALY of all the studied interventions, except for revision hip arthroplasty, was well below 50 000, a figure sometimes cited in the literature as a threshold level for the cost-effectiveness of an intervention. Based on the present study it may be concluded that routine evaluation of the cost-utility of secondary health care is feasible and produces information essential for a rational and balanced allocation of scarce health care resources.
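A minimal sketch of the cost-utility arithmetic used throughout these studies, assuming a single utility score measured before and after treatment and a fixed expected remaining lifetime; the figures are illustrative, not data from the studies.

```python
def qalys_gained(utility_before, utility_after, remaining_life_years):
    """QALY gain = change in HRQoL utility weighted by expected remaining life years."""
    return (utility_after - utility_before) * remaining_life_years

def cost_per_qaly(direct_cost, utility_before, utility_after, remaining_life_years):
    gain = qalys_gained(utility_before, utility_after, remaining_life_years)
    if gain <= 0:
        raise ValueError("No utility gain: cost per QALY is undefined")
    return direct_cost / gain

# Illustrative patient: utility 0.80 -> 0.86, 20 expected life years, 3 500 in direct costs.
print(round(cost_per_qaly(3_500, 0.80, 0.86, 20)))  # ~2917 per QALY gained
```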
Abstract:
A channel router is an important design aid in the design automation of VLSI circuit layout. Many algorithms have been developed based on various wiring models with routing done on two layers. With the recent advances in VLSI process technology, it is possible to have three independent layers for interconnection. In this paper two algorithms are presented for three-layer channel routing. The first assumes a very simple wiring model. This enables the routing problem to be solved optimally in a time of O(n log n). The second algorithm is for a different wiring model and has an upper bound of O(n²) for its execution time. It uses fewer horizontal tracks than the first algorithm. For the second model the channel width is not bounded by the channel density.
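As background on what a channel router optimizes, the sketch below implements the classic left-edge-style greedy track assignment for two-terminal nets with no vertical constraints: sort nets by their left endpoints and reuse a track whenever the previous net on it has ended. It is a textbook simplification for illustration only, not either of the two three-layer algorithms proposed in the paper.

```python
def left_edge_tracks(nets):
    """Assign horizontal intervals (left, right) to tracks greedily.

    Returns a list of tracks, each a list of intervals; with no vertical
    constraints the number of tracks equals the channel density.
    """
    track_ends = []   # right end of the last interval placed on each track
    layout = []
    for left, right in sorted(nets):
        for i, last_right in enumerate(track_ends):
            if left > last_right:          # track is free again
                track_ends[i] = right
                layout[i].append((left, right))
                break
        else:                              # no free track: open a new one
            track_ends.append(right)
            layout.append([(left, right)])
    return layout

# Example intervals (column spans of nets across the channel).
print(left_edge_tracks([(1, 4), (2, 6), (5, 8), (7, 9)]))
```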
Abstract:
Infection by Epstein-Barr virus (EBV) occurs in approximately 95% of the world's population. EBV was the first human virus implicated in oncogenesis. Characteristic of primary EBV infection are detectable IgM and IgG antibodies against viral capsid antigen (VCA). During convalescence the VCA IgM disappears while the VCA IgG persists for life. Reactivations of EBV occur among both immunocompromised and immunocompetent individuals. In serological diagnosis, measurement of avidity of VCA IgG separates primary from secondary infections. However, in serodiagnosis of mononucleosis it is quite common to encounter, paradoxically, VCA IgM together with high-avidity VCA IgG, indicating past immunity. We determined the etiology of this phenomenon and found that, among patients with primary cytomegalovirus (CMV) infection, a large proportion (23%) showed antibody profiles of EBV reactivation. In contrast, primary EBV infection did not appear to induce immunoreactivation of CMV. EBV-associated post-transplant lymphoproliferative disease (PTLD) is a life-threatening complication of allogeneic stem cell or solid organ transplantation. PTLD may present with a diverse spectrum of clinical symptoms and signs. Due to the rapidity of PTLD progression, especially after stem cell transplantation, the diagnosis must be obtained quickly. Given timely detection, the evolution of this fatal disease may be halted by reduction of immunosuppression. A promising new PTLD treatment (also in Finland) is based on anti-CD20 monoclonal antibodies. Diagnosis of PTLD has been demanding because of immunosuppression, blood transfusions and the latent nature of the virus. In 1999 we set up, to our knowledge the first in Finland for any microbial pathogen, a real-time quantitative PCR (qPCR) assay for detection of EBV DNA in serum/plasma. In addition, we set up an in situ hybridisation assay for EBV RNA in tissue sections. In collaboration with a group of haematologists at Helsinki University Central Hospital we retrospectively determined the incidence of PTLD among 257 allogeneic stem cell transplantations (SCT) performed during 1994-1999. Post-mortem analysis revealed 18 cases of PTLD. From a subset of PTLD cases (12/18) and a series of corresponding controls (36), consecutive serum samples were studied by the new EBV qPCR. All the PTLD patients were positive for EBV DNA with progressively rising copy numbers. In most PTLD patients EBV DNA became detectable within 70 days of SCT. Of note, the appearance of EBV DNA preceded the PTLD symptoms (fever, lymphadenopathy, atypical lymphocytes). Among the SCT controls, EBV DNA occurred only sporadically, and the EBV-DNA levels remained relatively low. We concluded that EBV qPCR is a highly sensitive (100%) and specific (96%) new diagnostic approach. We also looked for and found risk factors for the development of PTLD. Together with a liver transplantation group at the Transplantation and Liver Surgery Clinic we wanted to clarify how often and how severely EBV infections occur after liver transplantation. Using the EBV qPCR, we studied 1284 plasma samples obtained from 105 adult liver transplant recipients. EBV DNA was detected in 14 patients (13%) during the first 12 months. The peak viral loads of 13 asymptomatic patients were relatively low (<6600/ml), and EBV DNA subsided quickly from circulation. Fatal PTLD was diagnosed in one patient.
Finally, we wanted to determine the number and clinical significance of EBV infections of various types occurring among a large, retrospective, nonselected cohort of allogeneic SCT recipients. Using EBV qPCR, we analysed 5479 serum samples from 406 SCT recipients obtained during 1988-1999. EBV DNA was seen in 57 (14%) patients, of whom 22 (5%) showed progressively rising and ultimately high levels of EBV DNA (median 54 million/ml). Among the SCT survivors, EBV DNA was transiently detectable in 19 (5%) asymptomatic patients. Thus, low-level EBV-DNA positivity in serum occurs relatively often after SCT and may subside without specific treatment. However, high copy numbers (>50 000) are diagnostic of life-threatening EBV infection. We furthermore developed a mathematical algorithm for predicting the development of life-threatening EBV infection.
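To make the reading of serial qPCR results concrete, the sketch below flags a patient whose consecutive serum EBV-DNA levels are rising and whose latest level exceeds a high-copy threshold. The threshold follows the >50 000 copies figure quoted above, but the flagging rule itself is an illustrative simplification, not the prediction algorithm developed in the study.

```python
HIGH_COPY_THRESHOLD = 50_000   # copies/ml, the diagnostic level quoted in the text

def flag_life_threatening(ebv_dna_series):
    """Return True if the latest level exceeds the threshold after a rising trend.

    ebv_dna_series: chronologically ordered EBV-DNA copies/ml from serial serum samples.
    """
    if len(ebv_dna_series) < 2:
        return False
    rising = all(b >= a for a, b in zip(ebv_dna_series, ebv_dna_series[1:]))
    return rising and ebv_dna_series[-1] > HIGH_COPY_THRESHOLD

print(flag_life_threatening([300, 4_200, 61_000]))    # True: rising and above threshold
print(flag_life_threatening([2_000, 900, 400]))       # False: transient, subsiding
```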
Abstract:
This paper identifies two narratives of the Anthropocene and explores how they play out in the realm of future-looking fashion production. Each narrative draws on mythic comparisons to gods and monsters to express humanity’s dilemmas, albeit from different perspectives. The first is a Malthusian narrative of collapse and scarcity, brought about by the monstrous, unstoppable nature of human technology set loose on the natural world. In this vein, philosopher Slavoj Zizek (2010) draws on Biblical analogies, likening ecological crisis to one of the four horsemen of the apocalypse. To find a myth to suit the present times, novelist A.S. Byatt (2011) proposes Ragnarök, a Norse myth in which the gods destroy themselves. In contrast, the second narrative is one of technological cornucopia. Stewart Brand (2009, 27), a self-described ‘eco-pragmatist’, writes, ‘we are as gods and we have to get good at it’. In his view, human technologies offer the only hope of mitigating the problems caused by human technology – Brand suggests harnessing nuclear power, bioengineering of crops and the geoengineering of the planet as the way forward. Similarly, the French philosopher Bruno Latour (2012, 274) exhorts us to “love our monsters”, likening our technologies to Doctor Frankenstein’s monster – set loose upon the world, and then reviled by his creator. For both Brand and Latour, human technology may be monstrous, but it must also be turned toward solutions. Within this schema, hopeful visions of the future of fashion are similarly divided. In the techno-enabled cornucopian future, the fashion industry embraces wearable technology, speed and efficiency. Technologies such as waterless dyeing, 3D printing and self-cleaning garments shift fashion into a new era of cleaner production. Meanwhile, in the narrative of scarcity, a more cautious approach sees fashion return to a new localism and a valuing of the hand-made in a time of shrinking resources. Through discussion of future-looking fashion designers, brands, and activists, this paper explores how they may align along a spectrum between these two grand narratives of the future. The paper discusses how these narratives may unconsciously shape the perspective of both producers and users around the fashion of today and the fashion of tomorrow. It poses the question: what stories can be written for fashion’s future in the Anthropocene, and are they fated, or can they be re-written?
Abstract:
Metastatic kidney and breast cancer are devastating diseases currently lacking efficient treatment options. One promising developmental approach in cancer treatment is oncolytic adenoviruses, which have demonstrated excellent safety in many clinical trials. However, antitumor efficacy needs to be improved in order to make oncolytic viruses a viable treatment alternative. To be able to follow oncolytic virus replication in vivo, we set up a non-invasive imaging system based on coinjection of a replication-deficient luciferase-expressing virus and a replication-competent virus. The system was validated in vitro and in vivo and used in other projects of the thesis. In another study we showed that capsid modifications on adenoviruses result in enhanced gene transfer and an increased oncolytic effect on renal cancer cells in vitro. Moreover, capsid-modified oncolytic adenoviruses demonstrated significantly improved antitumor efficacy in murine kidney cancer models. To transcriptionally target kidney cancer tissue we evaluated two hypoxia response elements for their usability as tissue-specific promoters using a novel dual luciferase imaging system. Based on the results of the promoter evaluation and the studies on capsid modifications, we constructed a transcriptionally and transductionally targeted oncolytic adenovirus armed with an antiangiogenic transgene for enhanced renal cell cancer specificity and improved antitumor efficacy. This virus exhibited kidney cancer-specific replication and a significantly improved antitumor effect in a murine model of intraperitoneally disseminated renal cell cancer. Cancer stem cells are thought to be resistant to conventional cancer drugs and might play an important role in breast cancer relapse and the formation of metastasis. Therefore, we examined whether capsid-modified oncolytic adenoviruses are able to kill these putative breast cancer-initiating cells. An efficient oncolytic effect and significant antitumor efficacy on tumors established with breast cancer-initiating cells were observed, suggesting that oncolytic adenoviruses might be able to prevent breast cancer relapse and could be used in the treatment of metastatic disease. In conclusion, the results presented in this thesis suggest that genetically engineered oncolytic adenoviruses have great potential in the treatment of metastatic kidney and breast cancer.
Variation in tracheid cross-sectional dimensions and wood viscoelasticity: extent and control methods
Abstract:
Printing papers have been the main product of the Finnish paper industry. To improve the properties and economy of printing papers, the controlling of tracheid cross-sectional dimensions and wood viscoelasticity is examined in this study. Controlling is understood as any procedure which yields raw material classes with distinct properties and small internal variation. Tracheid cross-sectional dimensions, i.e., cell wall thickness and radial and tangential diameters, can be controlled with methods such as sorting wood into pulpwood and sawmill chips, sorting of logs according to tree social status and fractionation of fibres. These control methods were analysed in this study with simulations, which were based on measured tracheid cross-sectional dimensions. A SilviScan device was used to measure the data set from five Norway spruce (Picea abies) and five Scots pine (Pinus sylvestris) trunks. The simulation results indicate that the sawmill chips and top pulpwood assortments have quite similar cross-sectional dimensions. Norway spruce and Scots pine are on average also relatively similar in their cross-sectional dimensions. The distributions of these species are somewhat different, but from a practical point of view, the differences are probably of minor importance. The controlling of tracheid cross-sectional dimensions can be done most efficiently with methods that can separate fibres into earlywood and latewood. Sorting of logs or partitioning of logs into juvenile and mature wood were markedly less efficient control methods than fractionation of fibres. Wood viscoelasticity affects energy consumption in mechanical pulping, and is thus an interesting control target when improving the energy efficiency of the process. A literature study was made to evaluate the possibility of using viscoelasticity in controlling. The study indicates that there is considerable variation in viscoelastic properties within tree species, but unfortunately, the viscoelastic properties of important raw material lots such as top pulpwood or sawmill chips are not known. Viscoelastic properties of wood depend mainly on lignin, but also on microfibrillar angle, the width of cellulose crystals and tracheid cross-sectional dimensions.
Abstract:
The availability and quality of irrigation water has become an issue limiting productivity in many Australian vegetable regions. Production is also under competitive pressure from supply chain forces. Producers look to new technologies, including changing irrigation infrastructure, exploring new water sources, and more complex irrigation management, to survive these stresses. Often there is little objective information on which improvements could deliver better outcomes for vegetable producers and external communities (e.g. meeting NRM targets). This has led to investment in inappropriate technologies, and costly repetition of errors, as businesses independently discover the worth of technologies through personal experience. In our project, we investigated technology improvements for vegetable irrigation. Through engagement with industry and other researchers, we identified the technologies most applicable to growers, particularly those that addressed priority issues. We developed analytical tools for ‘what if’ scenario testing of technologies. We conducted nine detailed experiments in the Lockyer Valley and Riverina vegetable growing districts, as well as case studies on grower properties in southern Queensland. We investigated root zone monitoring tools (FullStop™ wetting front detectors and Soil Solution Extraction Tubes - SSET), drip system layout, fertigation equipment, and altered planting arrangements. Our project team developed and validated models for broccoli, sweet corn, green beans and lettuce, and spreadsheets for evaluating economic risks associated with new technologies. We presented project outcomes at over 100 extension events, including irrigation showcases, conferences, field days, farm walks and workshops. The FullStops™ were excellent for monitoring root zone conditions (EC, nitrate levels) and managing irrigation with poor-quality water. They were easier to interpret than the SSET. The SSET were simpler to install, but required wet soil to be reliable. SSET were an option for monitoring deeper soil zones that are unsuitable for FullStop™ installations. Because these root zone tools require expertise and are labour intensive, we recommend they be used to address specific problems, or as a periodic auditing strategy, not for routine monitoring. In our research, we routinely found high residual N in horticultural soils, with consequently little crop yield response to additional nitrogen fertiliser. With improved irrigation efficiency (and less leaching), it may be timely to re-examine nitrogen budgets and recommendations for vegetable crops. Where the drip irrigation tube was located close to the crop row (i.e. within 5-8 cm), management of irrigation was easier. It improved nitrogen uptake and water use efficiency, and reduced the risk of poor crop performance through moisture stress, particularly in the early crop establishment phases. Close proximity of the drip tube to the crop row gives the producer more options for managing salty water, and more flexibility in taking risks with forecast rain. In many vegetable crops, proximate drip systems may not be cost-effective. The next best alternative is to push crop rows closer to the drip tube (leading to an asymmetric row structure). The vegetable crop models are good at predicting crop phenology (development stages, time to harvest), input use (water, fertiliser), environmental impacts (nutrient, salt movement) and total yields.
The two immediate applications for the models are understanding/predicting/manipulating harvest dates and nitrogen movements in vegetable cropping systems. From the economic tools, the major influences on accumulated profit are price and yield. In doing ‘what if’ analyses, it is very important to be as accurate as possible in ascertaining what the assumed yield and price ranges are. In most vegetable production systems, lowering the required inputs (e.g. irrigation requirement, fertiliser requirement) is unlikely to have a major influence on accumulated profit. However, if a resource is constraining (e.g. available irrigation water), it is usually most profitable to maximise return per unit of that resource.
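The closing point about constrained resources can be illustrated with a few lines of arithmetic: when irrigation water is the limiting input, crops are better ranked by gross margin per megalitre than per hectare. The crop names and figures below are placeholders, not project data.

```python
# Placeholder gross margins ($/ha) and water use (ML/ha) for three hypothetical crops.
crops = {
    "broccoli":   {"gross_margin": 6_000, "water_ml_per_ha": 3.0},
    "sweet corn": {"gross_margin": 4_500, "water_ml_per_ha": 2.0},
    "lettuce":    {"gross_margin": 7_500, "water_ml_per_ha": 4.0},
}

# Rank by return per megalitre of water, the binding constraint.
ranked = sorted(crops.items(),
                key=lambda kv: kv[1]["gross_margin"] / kv[1]["water_ml_per_ha"],
                reverse=True)
for name, d in ranked:
    print(f"{name}: ${d['gross_margin'] / d['water_ml_per_ha']:.0f} per ML")
```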
Abstract:
The layout of this second edition follows that of the first, though the content has been substantially rewritten to reflect 10 years of research and development, as well as the emergence of new pest species. Chapter 1 presents an overview, from a somewhat entomological perspective, of tropical forestry in its many guises. Chapters 2, 3 and 4 then discuss the 'pure' biology and ecology of tropical insects and their co-evolved relationships with the trees and forests in which they live. Chapter 5 is necessarily the largest chapter in the book, looking in detail at a selection of major pest species from all over the tropical world. Chapters 6, 7, 8 and 9 then discuss the theory and practice of insect pest management, starting at the fundamental planning stage, before any seeds hit the soil. Nursery management and stand management are considered in Chapters 7 and 8. Chapter 9 covers the topics of forest health surveillance, quarantine and forest invasive species, topics which again have significance at all stages of forestry but for convenience are presented after nursery and forest management. Drawing these threads together is attempted in the final chapter, Chapter 10, which combines most of the previous nine chapters in examples illustrating the concept of integrated pest management. ©CABI Publishing
Abstract:
Genealogy of the Sokolosky family reaching back to their Posen origins; emigration to New Orleans, Mississippi and Texas in the 1860s; further family history in the USA until 1990. Also contains a preface by Rabbi Malcolm H. Stern, photographs of members of the Sokolosky family, of gravestones, and of family documents.
Abstract:
An asymmetrical flow field-flow fractionation (AsFlFFF) system was constructed, and its applicability to industrial, biochemical, and pharmaceutical applications was studied. The effect of several parameters, such as pH, ionic strength, temperature and the reactants' mixing ratios, on the particle sizes, molar masses, and the formation of aggregates of macromolecules was determined by AsFlFFF. In the case of the industrial application, AsFlFFF proved to be a valuable tool in the characterization of the hydrodynamic particle sizes, molar masses and phase transition behavior of various poly(N-isopropylacrylamide) (PNIPAM) polymers as a function of viscosity and phase transition temperatures. The effect of sodium chloride salt and the molar ratio of cationic and anionic polyelectrolytes on the hydrodynamic particle sizes of poly(methacryloxyethyl trimethylammonium chloride) and poly(ethylene oxide)-block-poly(sodium methacrylate) and their complexes was studied. The particle sizes of PNIPAM polymers and polyelectrolyte complexes measured by AsFlFFF were in agreement with those obtained by dynamic light scattering. The molar masses of PNIPAM polymers obtained by AsFlFFF and size exclusion chromatography also agreed well. In addition, AsFlFFF proved to be a practical technique in studies of the thermoresponsive behavior of polymers at temperatures up to about 50 °C. The suitability of AsFlFFF for biological, biomedical, and pharmaceutical applications was demonstrated by studying lipid-protein/peptide interactions and the stability of liposomes at different temperatures. AsFlFFF was applied to studies of the hydrophobic and electrostatic interactions between cytochrome c (a basic peripheral protein) and anionic lipid, oleic acid, and the surfactant sodium dodecyl sulphate. A miniaturized AsFlFFF constructed in this study was used to elucidate the effect of copper(II), pH, ionic strength, and vortexing on the particle sizes of low-density lipoproteins.
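As a pointer to how AsFlFFF retention data are converted into particle sizes, the sketch below applies the Stokes-Einstein relation together with a commonly quoted first-order retention-time approximation for asymmetrical flow FFF. The channel thickness, flow-rate ratio and diffusion coefficient are placeholder values, and the retention formula is stated here as an assumption rather than the calibration actually used in this work.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def hydrodynamic_diameter(diffusion_coeff, temp_k=298.15, viscosity=8.9e-4):
    """Stokes-Einstein: d_h = k_B * T / (3 * pi * eta * D), in metres."""
    return K_B * temp_k / (3 * math.pi * viscosity * diffusion_coeff)

def retention_time(diffusion_coeff, channel_thickness, v_cross, v_out):
    """First-order AsFlFFF approximation (assumed, not a measured calibration):
    t_r ~ w^2 / (6 D) * ln(1 + Vc / Vout)."""
    return channel_thickness ** 2 / (6 * diffusion_coeff) * math.log(1 + v_cross / v_out)

D = 4.0e-11          # m^2/s, placeholder diffusion coefficient
print(hydrodynamic_diameter(D) * 1e9, "nm")             # approximate hydrodynamic diameter
print(retention_time(D, 350e-6, 1.0, 0.5) / 60, "min")  # placeholder channel and flow ratio
```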