990 results for display technology
Abstract:
Costs of purchasing new piglets and of feeding them until slaughter are the main variable expenditures in pig fattening. They both depend on slaughter intensity, the nature of feeding patterns and the technological constraints of pig fattening, such as genotype. Therefore, it is of interest to examine the effect of production technology and changes in input and output prices on feeding and slaughter decisions. This study examines the problem by using a dynamic programming model that links genetic characteristics of a pig to feeding decisions and the timing of slaughter and takes into account how these jointly affect the quality-adjusted value of a carcass. The model simulates the growth mechanism of a pig under alternative feeding and slaughter patterns and then solves the optimal feeding and slaughter decisions recursively. The state of nature and the genotype of a pig are known in the analysis. The main contribution of this study is the dynamic approach that explicitly takes into account carcass quality while simultaneously optimising feeding and slaughter decisions. The method maximises the internal rate of return to the capacity unit. Hence, the results can have a vital impact on the competitiveness of pig production, which is known to be quite capital-intensive. The results suggest that the producer can benefit significantly from improvements in the pig's genotype, because they improve the efficiency of pig production. The annual benefits from obtaining pigs of improved genotype can be more than €20 per capacity unit. The annual net benefits of animal breeding to pig farms can also be considerable. Animals of improved genotype can reach optimal slaughter maturity more quickly and produce leaner meat than animals of poor genotype. In order to fully utilise the benefits of animal breeding, the producer must adjust feeding and slaughter patterns on the basis of genotype. The results suggest that the producer can benefit from flexible feeding technology. Flexible feeding technology segregates pigs into groups according to their weight, carcass leanness, genotype and sex, and thereafter optimises feeding and slaughter decisions separately for these groups. Typically, such a technology provides incentives to feed piglets with protein-rich feed such that the genetic potential to produce leaner meat is fully utilised. When the pig approaches slaughter maturity, the share of protein-rich feed in the diet gradually decreases and the amount of energy-rich feed increases. Generally, the optimal slaughter weight is within the weight range that pays the highest price per kilogram of pig meat. The optimal feeding pattern and the optimal timing of slaughter depend on price ratios. In particular, an increase in the price of pig meat provides incentives to increase growth rates up to the pig's biological maximum by increasing the amount of energy in the feed. Price changes and changes in slaughter premiums can also have large income effects. Key words: barley, carcass composition, dynamic programming, feeding, genotypes, lean, pig fattening, precision agriculture, productivity, slaughter weight, soybeans
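As a rough illustration of the recursive structure described above, the following Python sketch solves a heavily simplified version of the problem by backward induction. It is not the thesis model: the state is reduced to live weight alone, the growth, feed-cost and carcass-value functions are invented placeholders, and the objective is simple value-to-go rather than the internal rate of return per capacity unit.

```python
# Minimal backward-recursion sketch of a feeding/slaughter decision problem.
# All functions and numbers are illustrative placeholders, not the thesis model.
import numpy as np

T = 20                                    # planning horizon in weeks
weights = np.arange(25, 131)              # discretized live weight (kg)
energy_share = np.linspace(0.4, 1.0, 7)   # share of energy-rich feed in the diet

def weekly_gain(w, e):
    """Placeholder growth response: gain rises with energy share, tapers with weight."""
    return max(0.0, (8.0 - 0.04 * w) * (0.6 + 0.4 * e))

def feed_cost(w, e):
    """Placeholder weekly feed cost (protein-rich feed assumed dearer)."""
    return 2.5 + 4.0 * (1.0 - e) + 0.02 * w

def carcass_value(w, lean_bonus):
    """Placeholder quality-adjusted carcass value with a best-paying weight band."""
    base = 1.45 * 0.75 * w                          # price per kg x yield x weight
    in_range = 1.0 if 105 <= w <= 125 else 0.93     # price band adjustment
    return base * in_range + lean_bonus

# value[t][i]: best value-to-go when the pig weighs weights[i] at week t
value = np.zeros((T + 1, len(weights)))
value[T] = [carcass_value(w, 0.0) for w in weights]   # forced slaughter at T

for t in range(T - 1, -1, -1):
    for i, w in enumerate(weights):
        slaughter_now = carcass_value(w, 0.0)
        keep_feeding = -np.inf
        for e in energy_share:
            w_next = min(weights[-1], w + weekly_gain(w, e))
            j = int(round(w_next - weights[0]))
            keep_feeding = max(keep_feeding, -feed_cost(w, e) + value[t + 1][j])
        value[t][i] = max(slaughter_now, keep_feeding)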
Abstract:
This chapter provides updated information on avocado fruit quality parameters, sensory perception and maturity, production and postharvest factors affecting quality defects, disinfestation and storage (including pre-conditioning), predicting outturn quality and processing.
Abstract:
Objectives: In 2012, the National Institute for Health and Care Excellence (NICE) assessed dasatinib, nilotinib, and standard-dose imatinib as first-line treatment of chronic phase chronic myelogenous leukemia (CML). Licensing of these alternative treatments was based on randomized controlled trials assessing complete cytogenetic response (CCyR) and major molecular response (MMR) at 12 months as primary end points. We use this case study to illustrate the validation of CCyR and MMR as surrogate outcomes for overall survival in CML and how this evidence was used to inform NICE's recommendation on the public funding of these first-line treatments for CML.
Methods: We undertook a systematic review and meta-analysis to quantify the association between CCyR and MMR at 12 months and overall survival in patients with chronic phase CML. We estimated life expectancy by extrapolating long-term survival from the weighted overall survival stratified according to the achievement of CCyR and MMR.
Results: Five studies provided data on the observational association between CCyR or MMR and overall survival. Based on the pooled association between CCyR and MMR and overall survival, our modeling showed comparable predicted mean durations of survival (21–23 years) following first-line treatment with imatinib, dasatinib, or nilotinib.
Conclusions: This case study illustrates the consideration of surrogate outcome evidence in health technology assessment. Although it is often recommended that the acceptance of surrogate outcomes be based on randomized controlled trial data demonstrating an association between the treatment effect on both the surrogate outcome and the final outcome, this case study shows that policymakers may be willing to accept a lower level of evidence (i.e., observational association).
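The survival-weighting step can be illustrated with a toy calculation: mean survival is taken as a mixture over responders and non-responders, weighted by the probability of achieving the surrogate response at 12 months. The response shares and mortality rates below are invented for illustration, not the values estimated in the study, and the exponential extrapolation is only one of several plausible parametric choices.

```python
# Illustrative mixture calculation of life expectancy weighted by surrogate response.
import math

def mean_survival_exponential(annual_mortality_rate):
    """Mean survival (years) under a simple exponential extrapolation."""
    return 1.0 / annual_mortality_rate

def expected_survival(p_response, rate_responders, rate_non_responders):
    """Weight survival by whether the surrogate response (CCyR/MMR) is achieved."""
    return (p_response * mean_survival_exponential(rate_responders)
            + (1 - p_response) * mean_survival_exponential(rate_non_responders))

# Hypothetical 12-month CCyR probabilities for each first-line treatment.
ccyr_at_12_months = {"imatinib": 0.65, "dasatinib": 0.77, "nilotinib": 0.80}

for drug, p in ccyr_at_12_months.items():
    years = expected_survival(p, rate_responders=0.04, rate_non_responders=0.07)
    print(f"{drug}: predicted mean survival of roughly {years:.1f} years")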
Abstract:
Due to recent developments in CCD technology, aerial photography is now slowly changing from film to digital cameras. This new aspect of remote sensing both allows and requires new automated analysis methods. Basic research on the reflectance properties of natural targets is needed so that computerized processes can be fully utilized. For this reason, an instrument was developed at the Finnish Geodetic Institute for measuring the multiangular reflectance of small remote sensing targets, e.g. forest understorey or asphalt. The Finnish Geodetic Institute Field Goniospectrometer (FiGIFiGo) is a portable device that is operated by one or two persons. It can be reassembled at a new location in 15 minutes, and after that a target's multiangular reflectance can be measured in 10–30 minutes (with one illumination angle). FiGIFiGo has an effective spectral range of approximately 400 nm to 2000 nm. The measurements can be made either outdoors under sunlight or in the laboratory with a 1000 W QTH light source. In this thesis, FiGIFiGo is introduced and the theoretical basis of such reflectance measurements is discussed. A new method is introduced for extracting subcomponent proportions from the reflectance of a mixture sample, e.g. for retrieving the proportion of lingonberry reflectance in an observation of a lingonberry–lichen sample. This method was tested by conducting a series of measurements on the reflectance properties of artificial samples. The component separation method yielded sound results and revealed interesting aspects of the targets' reflectances. The method and the results still need to be verified with further studies, but the preliminary results imply that this method could be a valuable tool in the analysis of such mixture samples.
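The component-separation idea can be sketched as a linear mixing model: the observed spectrum is a weighted sum of pure-component spectra, and the proportions are recovered by constrained least squares. The synthetic spectra and the non-negative least-squares solver below are illustrative stand-ins; the actual method and endmember spectra used in the thesis may differ.

```python
# Linear-mixture sketch: recover component proportions from a mixed reflectance spectrum.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(400, 2000, 200)          # nm, FiGIFiGo's nominal range

# Synthetic "pure" component spectra (stand-ins for, e.g., lingonberry and lichen).
lingonberry = 0.1 + 0.4 * np.exp(-((wavelengths - 800) / 300) ** 2)
lichen = 0.3 + 0.2 * (wavelengths / 2000)

# Simulated mixture observation: 35 % lingonberry, 65 % lichen, plus noise.
true_props = np.array([0.35, 0.65])
A = np.column_stack([lingonberry, lichen])
mixture = A @ true_props + np.random.normal(0, 0.005, wavelengths.size)

# Non-negative least squares recovers the proportions; normalise to sum to one.
props, _ = nnls(A, mixture)
props /= props.sum()
print("estimated proportions:", props)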
Abstract:
The Australian hardwood plantation industry is challenged to identify profitable markets for the sale of its wood fibre. The majority of the hardwood plantations already established in Australia have been managed for the production of pulpwood; however, interest exists in identifying more profitable and value-added markets. As a consequence of a predominantly pulpwood-focused management regime, this plantation resource contains a range of qualities and performance. Identifying alternative processing strategies and products that suit young plantation-grown hardwoods has proved challenging, with low product recoveries and/or unmarketable products as the outcome of many studies. Simple spindleless lathe technology was used to process 918 billets from six commercially important Australian hardwood species. The study has demonstrated that the production of rotary-peeled veneer is an effective method for converting plantation hardwood trees. Recovery rates significantly higher than those reported for more traditional processing techniques (e.g., sawmilling) were achieved. Veneer visually graded to industry standards exhibited favourable recoveries suitable for the manufacture of structural products.
Abstract:
Past issues of Fibreculture have examined activist philosophies from angles such as social justice and networked organisational forms, communication rights and net neutrality debates, and the push back against precarious new media labour. Our issue extends this work by capturing the complexities associated with the use of technology in activist contexts, and by offering insights into how practitioners, scholars, and the makers of digital and networked technologies do, and might need to, work more collaboratively and pragmatically to address social justice issues.
Abstract:
This paper discusses the use of observational video recordings to document young children’s use of technology in their homes. Although observational research practices have been used for decades, often with video-based techniques, the participant group in this study (i.e., very young children) and the setting (i.e., private homes), provide a rich space for exploring the benefits and limitations of qualitative observation. The data gathered in this study point to a number of key decisions and issues that researchers must face in designing observational research, particularly where non-researchers (in this case, parents) act as surrogates for the researcher at the data collection stage. The involvement of parents and children as research videographers in the home resulted in very rich and detailed data about children’s use of technology in their daily lives. However, limitations noted in the dataset (e.g., image quality) provide important guidance for researchers developing projects using similar methods in future. The paper provides recommendations for future observational designs in similar settings and/or with similar participant groups.
Abstract:
This paper details a workshop aimed at exploring opportunities for experience design through wearable art and design concepts. Specifically, it explores the structure of the workshop with respect to facilitating learning through technology in the development of experiential wearable art and design. A case study titled Cloud Workshop: Wearables and Wellbeing; Enriching connections between citizens in the Asia-Pacific region was initiated through a cooperative partnership between Hong Kong Baptist University (HKBU), Queensland University of Technology (QUT) and Griffith University (GU). Digital technologies facilitated collaboration through an inter-disciplinary, inter-national and inter-cultural approach (Facer & Sandford, 2010) between Australia and Hong Kong. Students cooperated over a two-week period to develop innovative wearable concepts blending art, design and technology. An unpacking of the approach, pedagogical underpinning and final outcomes revealed distinct educational benefits as well as certain learning and technological challenges of the program. Qualitative feedback revealed additional successes with respect to student engagement and enthusiasm, while uncovering shortcomings in the delivery and management of information and difficulties with cultural interactions. Potential future versions of the program aim to take advantage of the positives and overcome the limitations of the current pedagogical approach. It is hoped the case study will become a catalyst for future workshops that blur the boundaries of art, design and technology to uncover further benefits and potentials for new outcomes in experience design.
Abstract:
Aurizon, Australia's largest freight railway operator, is investigating the use of Rail Power Conditioner (RPC) technology for load balancing, reactive power compensation and harmonic filtering. The new technology has the capability of replacing Static VAr Compensators (SVCs) and harmonic filters, and is expected to have a significant impact on the overall costs of railway electrification. This paper presents a theoretical analysis of the real and reactive power flows in an RPC used to balance active power in an existing V/V feeder station. This informed an RPC feasibility study undertaken at four existing Aurizon feeder stations with V/V-connected transformers.
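The load-balancing role of the RPC can be illustrated with a back-of-envelope calculation: the back-to-back converter shifts half of the active-power difference from the heavily loaded feeder section to the lightly loaded one and supplies each section's reactive power locally. The figures below are invented, and the sketch deliberately omits the additional reactive-power exchange needed for complete negative-sequence cancellation on a V/V transformer, which a full analysis of the converter power flows would need to include.

```python
# Back-of-envelope sketch of the RPC load-balancing principle.
def rpc_setpoints(p_left, q_left, p_right, q_right):
    """Return (active power shifted left->right, reactive power injected per section)."""
    p_shift = (p_left - p_right) / 2.0      # MW moved between the two feeder sections
    return p_shift, q_left, q_right

# Hypothetical section loads (MW, MVAr) on a V/V-connected feeder station.
p_shift, qc_left, qc_right = rpc_setpoints(p_left=18.0, q_left=9.0,
                                           p_right=6.0, q_right=4.0)

print(f"shift {p_shift:.1f} MW from the left section to the right section")
print(f"compensate {qc_left:.1f} MVAr (left) and {qc_right:.1f} MVAr (right) locally")
print(f"each section then draws {(18.0 + 6.0) / 2:.1f} MW at unity power factor")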
Abstract:
The concept of an atomic decomposition was introduced by Coifman and Rochberg (1980) for weighted Bergman spaces on the unit disk. By the Riemann mapping theorem, functions in every simply connected domain in the complex plane have an atomic decomposition. However, a decomposition resulting from a conformal mapping of the unit disk tends to be very implicit and often lacks a clear connection to the geometry of the domain that it has been mapped into. The lattice of points where the atoms of the decomposition are evaluated usually follows the geometry of the original domain, but after mapping the domain into another this connection is easily lost and the layout of points becomes seemingly random. In the first article we construct an atomic decomposition directly on a weighted Bergman space on a class of regulated, simply connected domains. The construction uses the geometric properties of the regulated domain, but does not explicitly involve any conformal Riemann map from the unit disk. It is known that the Bergman projection is not bounded on the space L-infinity of bounded measurable functions. Taskinen (2004) introduced the locally convex spaces LV-infinity, consisting of measurable functions, and HV-infinity, consisting of analytic functions on the unit disk, with the latter being a closed subspace of the former. They have the property that the Bergman projection is continuous from LV-infinity onto HV-infinity and, in some sense, the space HV-infinity is the smallest possible substitute for the space H-infinity of bounded analytic functions. In the second article we extend the above result to a smoothly bounded strictly pseudoconvex domain. Here the related reproducing kernels are usually not known explicitly, and thus the proof of continuity of the Bergman projection is based on generalised Forelli-Rudin estimates instead of integral representations. The minimality of the space LV-infinity is shown by using peaking functions first constructed by Bell (1981). Taskinen (2003) showed that on the unit disk the space HV-infinity admits an atomic decomposition. This result is generalised in the third article by constructing an atomic decomposition for the space HV-infinity on a smoothly bounded strictly pseudoconvex domain. In this case every function can be represented as a linear combination of atoms such that the coefficient sequence belongs to a suitable Köthe co-echelon space.
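For orientation, the generic shape of such a result can be written as follows; this is only the schematic form of an atomic decomposition (kernel-type atoms evaluated on a lattice, with coefficients in a sequence space), not the precise statement or exponents of the theorems in the articles.

```latex
% Schematic form of an atomic decomposition of a weighted Bergman space A^p_\omega:
% every f is a combination of atoms a_{z_k} anchored at a lattice of points (z_k),
% and the norm of f is comparable to the best attainable coefficient norm.
f \;=\; \sum_{k} \lambda_k \, a_{z_k},
\qquad
\|f\|_{A^p_\omega} \;\asymp\; \inf\Big\{ \big\|(\lambda_k)\big\|_{\ell^p} \;:\; f = \sum_{k} \lambda_k \, a_{z_k} \Big\}.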
Abstract:
Microarrays are high-throughput biological assays that allow the screening of thousands of genes for their expression. The main idea behind microarrays is to compute for each gene a unique signal that is directly proportional to the quantity of mRNA that was hybridized on the chip. The large number of steps, and the errors associated with each step, make the generated expression signal noisy. As a result, microarray data need to be carefully pre-processed before their analysis can be assumed to lead to reliable and biologically relevant conclusions. This thesis focuses on developing methods for improving the gene signal and further utilizing this improved signal for higher-level analysis. To achieve this, first, approaches for designing microarray experiments using various optimality criteria, considering both biological and technical replicates, are described. A carefully designed experiment leads to a signal with low noise, as the effect of unwanted variation is minimized and the precision of the estimates of the parameters of interest is maximized. Second, a system for improving the gene signal by using three scans at varying scanner sensitivities is developed. A novel Bayesian latent intensity model is then applied to these three sets of expression values, corresponding to the three scans, to estimate the suitably calibrated true signal of genes. Third, a novel image segmentation approach that segregates the fluorescent signal from the undesired noise is developed using an additional dye, SYBR green RNA II. This technique helped to identify signal arising only from the hybridized DNA, while signal corresponding to dust, scratches, spilled dye and other noise was avoided. Fourth, an integrated statistical model is developed in which signal correction, systematic array effects, dye effects and differential expression are modelled jointly, as opposed to a sequential application of several methods of analysis. The methods described here have been tested only for cDNA microarrays but can also, with some modifications, be applied to other high-throughput technologies. Keywords: High-throughput technology, microarray, cDNA, multiple scans, Bayesian hierarchical models, image analysis, experimental design, MCMC, WinBUGS.
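The multiple-scan idea can be illustrated with a toy calibration: each scan measures the same latent spot intensity with a different gain and saturates at the scanner's upper limit, and the latent intensities are recovered from the unsaturated readings. The alternating least-squares loop below is only a crude stand-in for the Bayesian latent intensity model developed in the thesis, and all data are simulated.

```python
# Toy multiple-scan calibration: recover latent spot intensities from three scans.
import numpy as np

rng = np.random.default_rng(0)
n_spots, saturation = 500, 65_535.0

# Simulated latent spot intensities and three scans at different sensitivities.
true_signal = rng.lognormal(mean=8.0, sigma=1.5, size=n_spots)
gains = np.array([0.5, 1.0, 2.0])
noise = rng.lognormal(mean=0.0, sigma=0.1, size=(3, n_spots))
scans = np.minimum(np.outer(gains, true_signal) * noise, saturation)
unsaturated = scans < saturation                     # usable readings per scan

# Alternating least squares: refit per-scan gains, then the latent signal.
# (The overall scale is only fixed by the initialisation from the middle scan.)
signal = scans[1].copy()
for _ in range(20):
    g = np.array([(scans[s, unsaturated[s]] * signal[unsaturated[s]]).sum()
                  / (signal[unsaturated[s]] ** 2).sum()
                  for s in range(3)])
    num = sum(g[s] * np.where(unsaturated[s], scans[s], 0.0) for s in range(3))
    den = sum(g[s] ** 2 * unsaturated[s] for s in range(3))
    signal = np.divide(num, den, out=signal, where=den > 0)   # skip all-saturated spots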
Abstract:
Large-scale chromosome rearrangements such as copy number variants (CNVs) and inversions account for a considerable proportion of the genetic variation between human individuals. In a number of cases, they have been closely linked with various inheritable diseases. Single-nucleotide polymorphisms (SNPs) are another large part of the genetic variation between individuals. They are also typically abundant, and measuring them is straightforward and cheap. This thesis presents computational means of using SNPs to detect the presence of inversions and deletions, a particular variety of CNVs. Technically, the inversion-detection algorithm detects the suppressed recombination rate between inverted and non-inverted haplotype populations, whereas the deletion-detection algorithm uses the EM algorithm to estimate the haplotype frequencies of a window with and without a deletion haplotype. As a contribution to population biology, a coalescent simulator for simulating inversion polymorphisms has been developed. Coalescent simulation is a backward-in-time method of modelling population ancestry. Technically, the simulator also models multiple crossovers by using the Counting model as the chiasma interference model. Finally, this thesis includes an experimental section. The aforementioned methods were tested on synthetic data to evaluate their power and specificity. They were also applied to the HapMap Phase II and Phase III data sets, yielding a number of candidates for previously unknown inversions and deletions, and also correctly detecting known rearrangements of this kind.
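As a self-contained illustration of the estimation machinery involved, the sketch below is the classic EM for two-SNP haplotype frequencies from unphased genotypes, in which only double heterozygotes have ambiguous phase. It is not the thesis algorithm itself, which extends this kind of estimation to SNP windows with and without an additional deletion haplotype.

```python
# Classic two-SNP EM for haplotype frequencies from unphased genotype data.
import numpy as np

def em_haplotype_freqs(genotypes, n_iter=100):
    """genotypes: (n, 2) array of 0/1/2 minor-allele counts at two SNPs."""
    # Haplotypes over two biallelic SNPs, indexed as 00, 01, 10, 11.
    freqs = np.full(4, 0.25)
    g = np.asarray(genotypes)
    for _ in range(n_iter):
        counts = np.zeros(4)
        for a, b in g:
            if a == 1 and b == 1:
                # Double heterozygote: phase is ambiguous (00/11 versus 01/10).
                w = 2 * freqs[0] * freqs[3]
                v = 2 * freqs[1] * freqs[2]
                p_cis = w / (w + v)
                counts[[0, 3]] += p_cis
                counts[[1, 2]] += 1 - p_cis
            else:
                # Phase is unambiguous: the genotype fixes both haplotype copies.
                counts[2 * (a > 0) + (b > 0)] += 1   # index of the first haplotype copy
                counts[2 * (a > 1) + (b > 1)] += 1   # index of the second haplotype copy
        freqs = counts / counts.sum()
    return freqs

# Toy usage: minor-allele counts at two SNPs for four individuals.
print(em_haplotype_freqs([[0, 0], [1, 1], [2, 2], [1, 0]]))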
Abstract:
Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
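The covering relation that organises the routing table can be illustrated concretely. The sketch below treats filters as conjunctions of per-attribute interval constraints, in the spirit of the subspace view of notifications: F1 covers F2 if every event matched by F2 is also matched by F1, which for intervals reduces to attribute-wise containment. The Filter class and field names are illustrative, not the thesis API.

```python
# Covering check for content-based filters expressed as interval constraints.
from dataclasses import dataclass, field

@dataclass
class Filter:
    # attribute name -> (low, high) closed-interval constraint
    constraints: dict[str, tuple[float, float]] = field(default_factory=dict)

def covers(f1: Filter, f2: Filter) -> bool:
    """True if every event matching f2 also matches f1."""
    for attr, (lo1, hi1) in f1.constraints.items():
        if attr not in f2.constraints:
            return False            # f2 is unconstrained here, f1 is not
        lo2, hi2 = f2.constraints[attr]
        if not (lo1 <= lo2 and hi2 <= hi1):
            return False            # f2's interval is not contained in f1's
    return True                     # attributes f1 leaves out match everything

# A covered filter need not be forwarded upstream: the covering one suffices.
broad = Filter({"temperature": (0.0, 100.0)})
narrow = Filter({"temperature": (20.0, 30.0), "room": (1.0, 5.0)})
print(covers(broad, narrow))   # True
print(covers(narrow, broad))   # False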