842 results for Use of information
Abstract:
Fibre diameter can vary dramatically along a wool staple, especially in the Mediterranean environment of southern Australia with its dry summers and abundance of green feed in spring. Other research results have shown a very low phenotypic correlation between fibre diameter grown in different seasons. Many breeders use short staples to measure fibre diameter for breeding purposes and also to promote animals for sale. The effectiveness of this practice is determined by the relative response to selection from measuring fibre traits on a full 12-month wool staple compared with measuring them on only part of a staple. If a high genetic correlation exists between the part record and the full record, then using part records may be acceptable to identify genetically superior animals. No information is available on the effectiveness of part records. This paper investigated whether wool growth and fibre diameter traits of Merino wool grown at different times of the year in a Mediterranean environment are genetically the same traits. The work was carried out on about seven dyebanded wool sections per animal per year, on ewes from weaning to hogget age, in the Katanning Merino resource flocks over 6 years. Relative clean wool growth of the different sections had very low heritability estimates of less than 0.10, and it was phenotypically and genetically poorly correlated with 6- or 12-month wool growth. This indicates that part-record measurement of clean wool growth of these sections will be ineffective as an indirect selection criterion to improve wool growth genetically. Staple length growth, as measured by the length between dyebands, would be more effective, with heritability estimates of between 0.20 and 0.30. However, these measurements were shown to have a low genetic correlation with wool grown for 12 months, which implies that these staple length measurements would be only half as efficient as the wool weight for 6 or 12 months at improving total clean wool weight. Heritability estimates of fibre diameter, coefficient of variation of fibre diameter and fibre curvature were relatively high, and these traits were genetically and phenotypically highly correlated across sections. High positive phenotypic and genetic correlations were also found between fibre diameter, coefficient of variation of fibre diameter and fibre curvature of the different sections and the same measurements for wool grown over 6 or 12 months. Coefficient of variation of fibre diameter of the sections also had a moderate negative phenotypic and genetic correlation with staple strength of wool staples grown over 6 months, indicating that coefficient of variation of fibre diameter of any section would be as good an indirect selection criterion to improve staple strength as coefficient of variation of fibre diameter for wool grown over 6 or 12 months. The results indicate that fibre diameter, coefficient of variation of fibre diameter and fibre curvature of wool grown over short periods of time have virtually the same heritability as that of wool grown over 12 months, and that the genetic correlation between fibre diameter, coefficient of variation of fibre diameter and fibre curvature on part and on full records is very high (rg > 0.85). This indicates that fibre diameter, coefficient of variation of fibre diameter and fibre curvature on part records can be used as selection criteria to improve these traits.
However, part records of greasy and clean wool growth would be much less efficient than fleece weight for wool grown over 6 or 12 months because of the low heritability of part records and the low genetic correlation between these traits on part records and on wool grown for 12 months.
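The "half as efficient" figure in the abstract above follows from the standard correlated-response relation of quantitative genetics; the short sketch below restates it with illustrative numbers that are assumptions, not values reported by the study. For indirect selection on a part record $X$ to improve a 6- or 12-month trait $Y$ (equal selection intensities assumed),

$$\frac{CR_Y}{R_Y} = r_g \,\frac{h_X}{h_Y},$$

where $h_X$ and $h_Y$ are the square roots of the heritabilities and $r_g$ is the genetic correlation. Taking $h_X^2 \approx 0.25$ for staple length growth between dyebands (within the reported 0.20-0.30 range), an assumed $h_Y^2 \approx 0.35$ for 12-month clean wool weight and $r_g \approx 0.55$, the ratio is about $0.55\sqrt{0.25/0.35} \approx 0.46$, i.e. roughly half the response obtainable by selecting directly on the 6- or 12-month wool weight.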
Abstract:
We investigate the extent and nature of the use of Twitter for financial reporting by ASX listed companies. We consider 199 financial information related tweets from 14 ASX listed companies’ Twitter accounts. A thematic analysis of these tweets shows ‘Earnings’ and ‘Operational Performance’ are the most discussed financial reporting themes. Further, a comparison across industry sectors reveals that listed companies from various industries show different usage patterns of financial reporting on Twitter. The examination of tweet sentiments also indicates a reporting bias within these tweets, as listed companies are more willing to disclose positive financial reporting tweets.
Abstract:
We discuss three approaches to the use of technology as a teaching and learning tool that we are currently implementing for a target group of about one hundred second-level engineering mathematics students. Central to these approaches is the underlying theme of motivating relatively poorly motivated students to learn, with the aim of improving learning outcomes. The approaches to be discussed have been used to replace, in part, more traditional mathematics tutorial sessions and lecture presentations. In brief, the first approach involves the application of constructivist thinking in the tertiary education arena, using technology as a computational and visual tool to create motivational knowledge conflicts or crises. The central idea is to model a realistic process of how scientific theory is actually developed, as proposed by Kuhn (1962), in contrast to more standard lecture and tutorial presentations. The second approach involves replacing procedural or algorithmic pencil-and-paper skills-consolidation exercises with software-based tasks. Finally, the third approach aims at creating opportunities for higher-order thinking via "on-line" exploratory or discovery-mode tasks. The latter incorporates the incubation period method, as originally discussed by Rubinstein (1975) and others.
Abstract:
This research provides information for achieving the required seismic mitigation in building structures through the use of semi-active and passive dampers. The Magneto-Rheological (MR) semi-active damper model was developed using control algorithms and integrated into seismically excited structures as a time-domain function. Linear and nonlinear structure models are evaluated in real-time scenarios. The research information can be used for the design and construction of earthquake-safe buildings with optimally employed MR dampers and MR-passive damper combinations.
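Purely as a loosely related sketch of the kind of system described above (this is not the MR damper model, the control algorithms, or the structures used in the research; every name and parameter value below is an assumption), the following Python snippet simulates a single-degree-of-freedom structure under synthetic ground acceleration with an idealized on-off "skyhook" semi-active damping law:

import numpy as np

# Illustrative sketch only: SDOF structure under broadband ground acceleration
# with an idealized semi-active damper whose coefficient is switched between
# c_min and c_max by a simple on-off "skyhook" rule. All values are assumed.
m, k = 1.0e5, 4.0e6                    # mass [kg], stiffness [N/m]
c0 = 2 * 0.02 * np.sqrt(k * m)         # 2% inherent structural damping [N s/m]
c_min, c_max = 1.0e4, 2.0e5            # assumed damper coefficient bounds [N s/m]

dt, T = 0.005, 20.0
t = np.arange(0.0, T, dt)
rng = np.random.default_rng(0)
ag = 0.5 * rng.standard_normal(t.size)  # crude broadband ground acceleration [m/s^2]

x, v, vg = 0.0, 0.0, 0.0                # relative displacement/velocity, ground velocity
drift = np.empty(t.size)
for i, a_g in enumerate(ag):
    vg += a_g * dt                                   # ground velocity from integrating ag
    c_d = c_max if (v + vg) * v >= 0.0 else c_min    # on-off skyhook switching rule
    a = -(c0 + c_d) / m * v - k / m * x - a_g        # equation of motion in relative coordinates
    v += a * dt                                      # semi-implicit Euler step
    x += v * dt
    drift[i] = x

print(f"peak drift with idealized semi-active damper: {np.max(np.abs(drift))*1000:.1f} mm")

A real MR damper would instead be represented by a hysteretic force model (for example of Bouc-Wen type) whose command voltage is set by the control algorithm; the on-off rule above is only a placeholder for such a controller.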
Abstract:
We investigate the use of transverse beam polarization in probing anomalous couplings of a Higgs boson to a pair of vector bosons at the International Linear Collider (ILC). We consider the most general form of the $VVH$ ($V = W/Z$) vertex consistent with Lorentz invariance and investigate its effects on the process $e^+e^- \to f\bar{f}H$, $f$ being a light fermion. Constructing observables with definite CP and naive time reversal ($\tilde{T}$) transformation properties, we find that transverse beam polarization helps us to improve the sensitivity to one part of the anomalous $ZZH$ coupling that is odd under CP. Even more importantly, it provides the possibility of discriminating from one another two terms in the general $ZZH$ vertex, both of which are even under CP and $\tilde{T}$. Use of transverse beam polarization, when combined with information from unpolarized and linearly polarized beams, therefore allows one to have completely independent probes of all the different parts of a general $ZZH$ vertex.
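For context, one commonly used parameterization of a general $ZZH$ vertex is sketched below; the notation is generic and the coefficient names are not necessarily those used in the paper:

$$\Gamma_{\mu\nu} = \frac{g\, m_Z}{\cos\theta_W}\left[ a_Z\, g_{\mu\nu} + \frac{b_Z}{m_Z^2}\left(k_{1\nu}k_{2\mu} - g_{\mu\nu}\, k_1\!\cdot\!k_2\right) + \frac{\tilde b_Z}{m_Z^2}\,\epsilon_{\mu\nu\alpha\beta}\, k_1^{\alpha}k_2^{\beta} \right],$$

where $k_1$ and $k_2$ are the momenta of the two $Z$ bosons. The $a_Z$ and $b_Z$ structures are even under CP, while the $\tilde b_Z$ structure is CP-odd; this is the distinction exploited by the CP- and $\tilde{T}$-based observables described in the abstract.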
Abstract:
Information exchange (IE) is a critical component of the complex collaborative medication process in residential aged care facilities (RACFs). Designing information and communication technology (ICT) to support complex processes requires a profound understanding of the IE that underpins their execution. There is little existing research that investigates the complexity of IE in RACFs and its impact on ICT design. The aim of this study was thus to undertake an in-depth exploration of the IE process involved in medication management to identify its implications for the design of ICT. The study was undertaken at a large metropolitan facility in NSW, Australia. A total of three focus groups, eleven interviews and two observation sessions were conducted between July and August 2010. Process modelling was undertaken by translating the qualitative data via in-depth iterative inductive analysis. The findings highlight the complexity and collaborative nature of IE in RACF medication management. These models emphasize the need to: a) deal with temporal complexity; b) rely on an interdependent set of coordinative artefacts; and c) use synchronous communication channels for coordination. Taken together, these are crucial aspects of the IE process in RACF medication management that need to be catered for when designing ICT in this critical area. This study provides important new evidence of the advantages of viewing the process as part of a system rather than as segregated tasks, as a means of identifying the latent requirements for ICT design that is able to support complex collaborative processes like medication management in RACFs. © 2012 IEEE.
Abstract:
Streptococcus pneumoniae is a leading cause of pneumonia, meningitis and bacteremia worldwide. The 23-valent pneumococcal polysaccharide vaccine (PPV23) is recommended for adults less than 65 years old with certain chronic medical conditions and for all elderly persons because of high rates of invasive pneumococcal infections (IPI) and increased risk of death. This study provides a comprehensive picture of the epidemiology of pneumococcal infections in Finland before the introduction of childhood pneumococcal conjugate vaccines, focusing on disease rates, risk factors, clinical outcome, and healthcare associated infections. This study was based on national, population-based laboratory surveillance for IPI. Information on all episodes of IPI was collected from the primary diagnostic laboratory. A case with IPI was defined as the isolation of S. pneumoniae from blood or cerebrospinal fluid during 1995-2002. Information on comorbidities and underlying conditions for IPI patients was obtained by linking the IPI surveillance database to other national, population-based health registries using each patient’s unique national identity code. In total, 4357 cases of IPI were identified. The overall annualized IPI incidence increased by 35% during the study period and was 10.6 per 100 000 population. The temporal increase in disease rates was associated with higher blood culturing rates over time. In working age adults, two-thirds of severe infections and one half of fatal cases occurred in persons with no recognized PPV23 indication. Persons with asthma were at increased risk for IPI and this new risk factor accounted for 5% of the overall disease burden. One tenth of pneumococcal bacteremias were healthcare-associated, and mortality among these patients was over twice as high as among patients with community-associated bacteremia. Most patients with nosocomial infections had underlying conditions for which PPV23 is recommended. The incidence of IPI in Finland has increased and the overall disease burden is higher than previously reported. The findings of this study underscore the urgent need for improved prevention efforts against pneumococcal infections in Finland through increased use of PPV23 in adult risk groups and introduction of childhood immunization with pneumococcal conjugate vaccine.
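As a rough arithmetic consistency check (the population figure is an assumption, not given in the abstract): with a Finnish population of roughly 5.2 million, the 4357 cases identified over the eight years 1995-2002 correspond to about 4357 / 8 ≈ 545 cases per year, i.e. 545 / 5 200 000 × 100 000 ≈ 10.5 per 100 000 population, in line with the reported overall annualized incidence of 10.6 per 100 000.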
Abstract:
Increasing antimicrobial resistance in bacteria has led to the need for a better understanding of antimicrobial usage patterns. In 1999, the World Organisation for Animal Health (OIE) recommended that an international ad hoc group should be established to address human and animal health risks related to antimicrobial resistance and the contribution of antimicrobial usage in veterinary medicine. In European countries the need for continuous recording of the usage of veterinary antimicrobials, as well as for animal species-specific and indication-based data on usage, has been acknowledged. Finland has been among the first countries to develop prudent use guidelines in veterinary medicine, as the Ministry of Agriculture and Forestry issued the first animal species-specific, indication-based recommendations for antimicrobial use in animals in 1996. These guidelines were revised in 2003 and 2009. However, surveillance of the species-specific use of antimicrobials in animals has not been performed in Finland. This thesis provides animal species-specific information on indication-based antimicrobial usage. Different methods for data collection have been utilized. Information on antimicrobial usage in animals has been gathered in four studies (studies A-D). Material from studies A, B and C has been used in an overlapping manner in the original publications I-IV. Study A (original publications I & IV) presents a retrospective cross-sectional survey of prescriptions for small animals at the Veterinary Teaching Hospital of the University of Helsinki. Prescriptions for antimicrobial agents (n = 2281) were collected and usage patterns, such as the indication and length of treatment, were reviewed. Most of the prescriptions were for dogs (78%), primarily for the treatment of skin and ear infections, most of which were treated with cephalexin for a median period of 14 days. Prescriptions for cats (18%) were most often for the treatment of urinary tract infections with amoxicillin for a median length of 10 days. Study B (original publication II) was a retrospective cross-sectional survey in which prescriptions for animals were collected from 17 University Pharmacies nationwide. Antimicrobial prescriptions (n = 1038), mainly for dogs (65%) and cats (19%), were investigated. In this study, cephalexin and amoxicillin were also the most frequently used drugs for dogs and cats, respectively. In study C (original publications III & IV), the indication-based usage of antimicrobials by practicing veterinarians was analyzed using a prospective questionnaire. Randomly selected practicing veterinarians in Finland (n = 262) recorded all their antimicrobial usage during a 7-day study period. Cattle (46%) with mastitis were the most common patients receiving antimicrobial treatment, generally intramuscular penicillin G or intramammary treatment with ampicillin and cloxacillin. The median length of treatment was four days, regardless of the route of administration. Antimicrobial use in horses was evaluated in study D, the results of which are previously unpublished. Firstly, data collected with the prospective questionnaire from the practicing veterinarians showed that horses (n = 89) were frequently treated for skin or wound infections using penicillin G or trimethoprim-sulfadiazine. The mean duration of treatment was five to seven days.
Secondly, according to retrospective data collected from patient records, horses (n = 74) that underwent colic surgery at the Veterinary Teaching Hospital of the University of Helsinki were generally treated according to national and hospital recommendations; penicillin G and gentamicin were administered preoperatively and treatment was continued for a median of three days postoperatively. In conclusion, Finnish veterinarians followed the national prudent use guidelines well. Narrow-spectrum antimicrobials were preferred and, for instance, fluoroquinolones were used sparingly. Prescription studies seemed to give good information on antimicrobial usage, especially when combined with complementary information from patient records. A prospective questionnaire study provided a fair amount of valuable data on several animal species. Electronic surveys are worth exploiting in the future.
Abstract:
One of the central issues in making efficient use of IT in the design, construction and maintenance of buildings is the sharing of digital building data across disciplines and lifecycle stages. One technology which enables data sharing is CAD layering, which, to be of real use, requires the definition of standards. This paper focuses on the background, objectives and effectiveness of the international standard ISO 13567, Organisation and naming of layers for CAD. The focus is in particular on the efficiency and effectiveness of the standardisation and standard implementation process, rather than on the technical details. The study was conducted as a qualitative study with a number of experts who responded to a semi-structured mail questionnaire, supplemented by personal interviews. The main results were that CAD layer standards based on the ISO standard have been implemented, particularly in northern European countries, but are not very widely used. A major problem identified was the lack of resources for marketing and implementing the standard, as national variations, once it had been formally accepted.
Abstract:
Introduction: This case study is based on the experiences with the Electronic Journal of Information Technology in Construction (ITcon), founded in 1995. Development: This journal is an example of a particular category of open access journals, which use neither author charges nor subscriptions to finance their operations, but rely largely on unpaid voluntary work in the spirit of the open source movement. The journal has, after some initial struggle, survived its first decade and is now established as one of half-a-dozen peer reviewed journals in its field. Operations: The journal publishes articles as they become ready, but creates virtual issues through alerting messages to “subscribers”. It has also started to publish special issues, since this helps in attracting submissions, and also helps in sharing the work-load of review management. From the start the journal adopted a rather traditional layout of the articles. After the first few years the HTML version was dropped and papers are only published in PDF format. Performance: The journal has recently been benchmarked against the competing journals in its field. Its acceptance rate of 53% is slightly higher, and its average turnaround time of seven months is almost a year faster, compared to those journals in the sample for which data could be obtained. The server log files for the past three years have also been studied. Conclusions: Our overall experience demonstrates that it is possible to publish this type of OA journal, with a yearly publishing volume equal to that of a quarterly journal and involving the processing of some fifty submissions a year, using a networked volunteer-based organization.
Abstract:
Triggered by the very quick proliferation of Internet connectivity, electronic document management (EDM) systems are now rapidly being adopted for managing the documentation that is produced and exchanged in construction projects. Nevertheless, there are still substantial barriers to the efficient use of such systems, mainly of a psychological nature and related to insufficient training. This paper presents the results of empirical studies carried out during 2002 concerning the current usage of EDM systems in the Finnish construction industry. The studies employed three different methods in order to provide a multifaceted view of the problem area, at both the industry and individual project level. In order to provide an accurate measurement of overall usage volume in the industry as a whole, telephone interviews were conducted with key personnel from 100 randomly chosen construction projects. The interviews showed that while around 1/3 of big projects have already adopted the use of EDM, very few small projects have adopted this technology. The barriers to introduction were investigated through interviews with representatives of half a dozen providers of systems and ASP services. These interviews shed a lot of light on the dynamics of the market for this type of service and illustrated the diversity of business strategies adopted by vendors. In the final study, log files from a project which had used an EDM system were analysed in order to determine usage patterns. The results illustrated that use is as yet incomplete in coverage and that only a part of the individuals involved in the project used the system efficiently, either as information producers or consumers. The study also provided feedback on the usefulness of the log files.
Abstract:
As companies become more efficient with respect to their internal processes, they begin to shift the focus beyond their corporate boundaries. Thus, recent years have witnessed an increased interest by practitioners and researchers in interorganizational collaboration, which promises better firm performance through more effective supply chain management. It is no coincidence that this interest comes in parallel with the recent advancements in Information and Communication Technologies, which offer many new collaboration possibilities for companies. However, collaboration, or any other type of supply chain integration effort, relies heavily on information sharing. Hence, this study focuses on information sharing, in particular on the factors that determine it and on its value. The empirical evidence from Finnish and Swedish companies suggests that uncertainty (both demand and environmental) and dependency in terms of switching costs and asset-specific investments are significant determinants of information sharing. Results also indicate that information sharing improves company performance regarding resource usage, output, and flexibility. However, companies share information more intensely at the operational rather than the strategic level. The use of supply chain practices and technologies is substantial but varies across the two countries. This study sheds light on a common trend in supply chains today. Whereas the results confirm the value of information sharing, the contingent factors help to explain why the intensity of information shared across companies differs. In the future, competitive pressures and uncertainty are likely to intensify. Therefore, companies may want to continue with their integration efforts by focusing on the determinants discussed in this study. However, at the same time, the possibility of opportunistic behavior by the exchange partner cannot be disregarded.
Abstract:
Processor architects face the challenging task of evaluating a large design space consisting of several interacting parameters and optimizations. In order to assist architects in making crucial design decisions, we build linear regression models that relate processor performance to micro-architecture parameters, using simulation-based experiments. We obtain good approximate models using an iterative process in which Akaike's information criterion is used to extract a good linear model from a small set of simulations, and limited further simulation is guided by the model using D-optimal experimental designs. The iterative process is repeated until the desired error bounds are achieved. We used this procedure to establish the relationship of the CPI performance response to 26 key micro-architectural parameters using a detailed cycle-by-cycle superscalar processor simulator. The resulting models provide a significance ordering on all micro-architectural parameters and their interactions, and explain the performance variations of micro-architectural techniques.
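As a loose illustration of the kind of procedure described above (not the paper's simulator, data, or exact algorithm; all parameter names and numbers below are invented), the following Python sketch fits a linear model to a synthetic CPI response, greedily selects regressors using AIC, and then picks the next configuration to "simulate" with a one-step greedy D-optimality criterion:

import numpy as np

# Sketch only: synthetic "CPI" as a linear function of a few coded
# micro-architectural parameters plus noise.
rng = np.random.default_rng(1)
names = ["issue_width", "rob_size", "l1_size", "l2_lat", "bpred_entries"]
n, p = 40, len(names)
X = rng.uniform(-1.0, 1.0, size=(n, p))                 # coded parameter settings
true_beta = np.array([-0.30, -0.20, -0.10, 0.25, 0.0])
y = 1.2 + X @ true_beta + 0.05 * rng.standard_normal(n)  # synthetic CPI response

def aic(Xs, y):
    """AIC (up to constants) for an OLS fit with intercept: n*log(RSS/n) + 2k."""
    A = np.column_stack([np.ones(len(y)), Xs]) if Xs.size else np.ones((len(y), 1))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = float(np.sum((y - A @ beta) ** 2))
    return len(y) * np.log(rss / len(y)) + 2 * A.shape[1]

# Greedy forward selection of regressors guided by AIC.
selected, remaining = [], list(range(p))
best = aic(np.empty((n, 0)), y)
improved = True
while improved and remaining:
    improved = False
    score, j = min((aic(X[:, selected + [j]], y), j) for j in remaining)
    if score < best:
        best, improved = score, True
        selected.append(j)
        remaining.remove(j)
print("selected parameters:", [names[j] for j in selected], f"(AIC={best:.1f})")

# Greedy D-optimal augmentation: among candidate configurations, pick the one
# that maximizes det(A^T A) of the design matrix restricted to selected terms.
cand = rng.uniform(-1.0, 1.0, size=(200, p))
A_cur = np.column_stack([np.ones(n), X[:, selected]])
def logdet_after(row):
    A_new = np.vstack([A_cur, np.r_[1.0, row[selected]]])
    return np.linalg.slogdet(A_new.T @ A_new)[1]
best_row = max(cand, key=logdet_after)
print("next configuration to simulate (coded):",
      dict(zip(names, np.round(best_row, 2))))

In the paper's setting the synthetic response would be replaced by runs of the cycle-accurate simulator, and the AIC-selected model and D-optimal points would alternate until the desired error bounds are met.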
Abstract:
Mesoscale weather phenomena, such as the sea breeze circulation or lake effect snow bands, are typically too large to be observed at one point, yet too small to be caught in a traditional network of weather stations. Hence, the weather radar is one of the best tools for observing, analyzing and understanding their behavior and development. A weather radar network is a complex system, which has many structural and technical features to be tuned, from the location of each radar to the number of pulses averaged in the signal processing. These design parameters have no universal optimal values; their selection depends on the nature of the weather phenomena to be monitored as well as on the applications for which the data will be used. The priorities and critical values are different for forest fire forecasting, aviation weather services or the planning of snow ploughing, to name a few radar-based applications. The main objective of the work performed within this thesis has been to combine knowledge of the technical properties of radar systems with our understanding of weather conditions in order to produce better applications, able to efficiently support decision making in weather- and safety-related service duties for modern society in northern conditions. When a new application is developed, it must be tested against "ground truth". Two new verification approaches for radar-based hail estimates are introduced in this thesis. For mesoscale applications, finding a representative reference can be challenging, since these phenomena are by definition difficult to catch with surface observations. Hence, almost any valuable information that can be distilled from unconventional data sources, such as newspapers and holiday snapshots, is welcome. However, just as important as getting the data is obtaining estimates of data quality, and judging to what extent the two disparate information sources can be compared. The presented new applications do not rely on radar data alone, but ingest information from auxiliary sources such as temperature fields. The author concludes that in the future the radar will continue to be a key source of data and information, especially when used effectively together with other meteorological data.
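For illustration only, the sketch below shows a generic categorical verification of radar-based hail estimates against surface "ground truth" reports; it is not one of the two new verification approaches developed in the thesis, and the data are fabricated booleans purely to demonstrate the score computation:

import numpy as np

# Generic contingency-table verification of yes/no hail estimates.
rng = np.random.default_rng(2)
n = 500                                               # (day, location) pairs, synthetic
radar_hail = rng.random(n) < 0.12                     # radar-based hail flag
truth_hail = (radar_hail & (rng.random(n) < 0.7)) | (rng.random(n) < 0.03)  # synthetic reports

hits         = int(np.sum(radar_hail & truth_hail))
misses       = int(np.sum(~radar_hail & truth_hail))
false_alarms = int(np.sum(radar_hail & ~truth_hail))

pod = hits / (hits + misses)                          # probability of detection
far = false_alarms / (hits + false_alarms)            # false alarm ratio
csi = hits / (hits + misses + false_alarms)           # critical success index
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")

In practice the "truth" column would come from surface reports or the unconventional sources mentioned above, and their uncertainty would itself need to be estimated before the scores can be interpreted.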