835 results for 080601 Aboriginal and Torres Strait Islander Information and Knowledge Systems


Relevance:

100.00%

Publisher:

Abstract:

This study examines the organizational structures and decision-making processes used by school districts to recruit and hire school librarians. For students to acquire the information and technology literacy education they need, school libraries must be staffed with qualified individuals who can fulfill the librarian's role as leader, teacher, instructional partner, information specialist, and program administrator. Principals are typically given decision rights for hiring staff, including school librarians. Research shows that principals have limited knowledge of the skills and abilities of the school librarian and of the specific needs and functions of the library program. Research also indicates that those with specific knowledge of school library programs, namely school district library supervisors, are consulted on recruiting and hiring only about half the time. School districts entrust library supervisors with responsibilities such as the professional development of school librarians only after those librarians are hired. This study uses a theoretical lens from research on IT governance, which focuses on knowledge-fit in assigning decision rights within an organization. This framework is appropriate because it incorporates a specialist with a specific knowledge set when determining where input and decision rights are placed in the decision-making process. The method was a multiple-case study design using five school districts as cases, varying by the involvement of the supervisors and other individuals in the hiring process. The data collected from each school district were interviews about the district's recruiting and hiring practices with principals, an individual in HR, library supervisors, and recently hired school librarians. Data analysis was conducted through iterative coding of themes from the research questions, with continuous adjustments as new themes developed. Results indicate that the governance framework is applicable to evaluating the decision-making processes used in recruiting and hiring school librarians. However, districts did not consistently apply knowledge-fit when determining input and decision rights. In the hiring process, governance was more likely to be based on placing decision rights at a certain level of the district hierarchy rather than on the location of specific knowledge, most often resulting in site-based governance with decision rights at the school-building level. The governance of the recruiting process was most affected by the shortage or surplus of candidates available to the district to fill positions. Districts struggling with a shortage of candidates typically placed governance of recruiting decisions at the district level, giving the library supervisor more opportunity for input and collaboration with human resources. In districts that use site-based governance and place all input and decision rights at the building level, some principals use their autonomy to eliminate the school library position in the allotment phase, or to hire librarians who, while certified through testing, do not have the same level of expertise as those certified through LIS programs. Principals in districts that use site-based governance for decision rights but call on the library supervisor for advisement reported finding the supervisor's expertise valuable in evaluating candidates for hire.
In no district was a principal or school required to involve the library supervisor in the hiring of school librarians. With a better understanding of the tasks involved, the effect of district governance on decision-making, and the use of knowledge to assign input and decision rights, it becomes possible to examine how these factors affect the quality of the hire. A next step is to trace the hiring processes that school librarians went through and connect them with the measurable outcomes of hiring: school librarian success, retention, and attrition; the quality of school library program services, outreach, and involvement in a school; and the perceived success of the school librarian and the library program as seen by students, teachers, administrators, parents, and other community stakeholders.

Relevance:

100.00%

Publisher:

Abstract:

Part 19: Knowledge Management in Networks

Relevance:

100.00%

Publisher:

Abstract:

Part 8: Business Strategies Alignment

Relevance:

100.00%

Publisher:

Abstract:

The use of silvopastoral systems (SPS) can be a good alternative to reduce the environmental impacts of livestock breeding in Brazil. Despite the advantages offered by public policies, many producers hesitate to use this system. One of the reasons is the lack of information on the health and productivity of cattle raised under these conditions. The experiment reported here was designed to compare the behavior of infection by gastrointestinal nematodes and the weight gain of beef cattle raised in an SPS and a conventional pasture system. We monitored the number of eggs per gram of feces, the prevalent nematode genera, data on climate, forage availability, weight gain, and the packed cell volume (PCV) of the animals bred in the two systems. The infection by nematodes was significantly higher in the cattle raised in the SPS (p < 0.05). The coprocultures revealed the presence of nematodes of the genera Haemonchus, Cooperia, Oesophagostomum and Trichostrongylus in both systems, but the mean infestation rates of Haemonchus and Cooperia were higher in the SPS (p < 0.05). The average PCV values did not differ between the cattle in the two systems. The individual weight gain and stocking rate in the period did not vary between the systems (p > 0.05). Despite the higher prevalence of nematodes in the SPS, no negative impact was detected on the animals' weight gain or health. The results of this experiment indicate that, under the conditions studied, there is no need to alter parasite management to assure good productive performance of cattle.

Relevance:

100.00%

Publisher:

Abstract:

The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with Partial Turbulence Simulation.

In this dissertation, the test procedure and scaling requirements for tests in partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predicted data derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data.

For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings like roof pavers are not well covered by many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of the existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
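To make the quasi-steady treatment of the missing low-frequency turbulence more concrete, the following minimal sketch (plain Python with NumPy, placeholder numbers throughout) augments a wind-tunnel pressure-coefficient record with an assumed low-frequency gust component and re-estimates the peak suction. It is an illustrative Monte Carlo stand-in, not the dissertation's analytical joint-probability derivation; the Cp record, low-frequency turbulence intensity, and percentile level are all assumptions.

# Hypothetical quasi-steady augmentation of wind-tunnel pressure coefficients
# with low-frequency turbulence that the tunnel could not reproduce.
import numpy as np

rng = np.random.default_rng(0)

# Measured (high-frequency) Cp record from the tunnel, referenced to the mean
# tunnel speed U_mean (placeholder data standing in for a real time series).
cp_tunnel = rng.normal(loc=-0.8, scale=0.25, size=100_000)

# Assumed standard deviation of the missing low-frequency velocity
# fluctuations, as a fraction of the mean wind speed.
sigma_low = 0.10

# Treat each low-frequency gust as a slow change of the effective mean speed:
# quasi-steady scaling multiplies Cp by (U_inst / U_mean)^2 = (1 + u_low/U)^2.
u_low = rng.normal(0.0, sigma_low, size=cp_tunnel.size)
cp_full = cp_tunnel * (1.0 + u_low) ** 2

# Full-scale-equivalent peak suction estimate (0.1st percentile of Cp).
cp_peak = np.quantile(cp_full, 0.001)
print(f"estimated peak Cp ~ {cp_peak:.2f}")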

Relevance:

100.00%

Publisher:

Abstract:

Sweet potato is an important strategic agricultural crop grown in many countries around the world. The roots and aerial vine components of the crop are used for human consumption and, to some extent, as a cheap source of animal feed. In spite of its economic value and growing contribution to health and nutrition, harvested sweet potato roots and aerial vine components have a limited shelf-life and are easily susceptible to post-harvest losses. Although post-harvest losses of both sweet potato roots and aerial vine components are significant, there is no information available to support the design and development of appropriate storage and preservation systems. In this context, the present study was initiated to improve scientific knowledge about sweet potato post-harvest handling. Additionally, the study also seeks to develop a PV-ventilated mud storehouse for storage of sweet potato roots under tropical conditions. In study one, the airflow resistance of sweet potato aerial vine components was investigated. The influence of operating parameters such as airflow rate, moisture content and bulk depth, each at different levels, on airflow resistance was analyzed. All the operating parameters were observed to have a significant (P < 0.01) effect on airflow resistance. Prediction models were developed and were found to adequately describe the experimental pressure-drop data. In study two, the resistance to airflow through unwashed and clean sweet potato roots was investigated. The effect of root shape factor, surface roughness, orientation to airflow, and presence of soil fraction on airflow resistance was also assessed. The pressure drop through unwashed and clean sweet potato roots was observed to increase with higher airflow, bed depth, root grade composition, and presence of soil fraction. The physical properties of the roots were incorporated into a modified Ergun model and compared with a modified Shedd's model; the modified Ergun model provided the better fit to the experimental data. In study three, the effect of sweet potato root size (medium and large) and of different air velocities and temperatures on the cooling or heating rate and time of individual sweet potato roots was investigated. A simulation model based on the fundamental solution of the transient heat-conduction equations was also proposed for estimating the cooling and heating time at the centre of sweet potato roots. The results showed that increasing air velocity during cooling and heating significantly (P < 0.05) affects the cooling and heating times. Furthermore, the cooling and heating times differed significantly (P < 0.05) between medium and large sweet potato roots. Comparison of the simulation results with experimental data confirmed that the transient simulation model can be used to accurately estimate the cooling and heating times of whole sweet potato roots under forced-convection conditions. In study four, the performance of charcoal evaporative cooling pad configurations for integration into sweet potato root storage systems was investigated. The experiments were carried out at different levels of air velocity and water flow rate and with three pad configurations: a single-layer pad (SLP), a double-layer pad (DLP) and a triple-layer pad (TLP), made of small and large charcoal particles. The results showed that higher air velocity has a pronounced effect on pressure drop. Increasing the water flow rate above the range tested had no practical benefit in terms of cooling. The DLP and TLP configurations, with their larger wet surface area for both types of pads, provided high cooling efficiencies. In study five, a CFD technique in the ANSYS Fluent software was used to simulate airflow distribution in a low-cost mud storehouse. By theoretically investigating different geometries of the air inlet, plenum chamber, and outlet, as well as their placement, an acceptable geometry with uniform air distribution was selected and constructed. Experimental measurements validated the selected design. In study six, the performance of the developed PV-ventilated system was investigated. Field measurements showed satisfactory results for the directly coupled PV-ventilated system. Furthermore, the option of integrating a low-cost evaporative cooling system into the mud storage structure was also investigated. The results showed a reduction of the air temperature inside the mud storehouse while relative humidity was increased. The ability of the developed storage system to provide and maintain the airflow, temperature and relative humidity that are the key parameters for shelf-life extension of sweet potato roots highlights its potential to reduce post-harvest losses at the farmer level, particularly under tropical climate conditions.
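As a rough illustration of the packed-bed pressure-drop calculation underlying studies one and two, the sketch below evaluates the classical (unmodified) Ergun equation in Python. The study's modified form, which incorporates root shape factor and soil fraction, is not reproduced, and every number in the example (equivalent root diameter, bed porosity, airflow) is an assumption for illustration only.

# Classical Ergun equation for pressure drop through a packed bed of roots.
def ergun_pressure_drop(velocity, depth, d_p, porosity, rho=1.2, mu=1.8e-5):
    """Pressure drop (Pa) across a bed of `depth` m at superficial air
    `velocity` m/s, for an equivalent particle diameter `d_p` m, given air
    density `rho` (kg/m^3) and dynamic viscosity `mu` (Pa s)."""
    viscous = 150.0 * mu * (1 - porosity) ** 2 * velocity / (porosity ** 3 * d_p ** 2)
    inertial = 1.75 * rho * (1 - porosity) * velocity ** 2 / (porosity ** 3 * d_p)
    return (viscous + inertial) * depth

# Example: 0.25 m/s airflow through a 1.0 m deep bed, assuming a 0.06 m
# equivalent root diameter and 0.42 bed porosity.
print(f"{ergun_pressure_drop(0.25, 1.0, 0.06, 0.42):.1f} Pa")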

Relevance:

100.00%

Publisher:

Abstract:

Metadata that is associated with either an information system or an information object for purposes of description, administration, legal requirements, technical functionality, use and usage, and preservation plays a critical role in ensuring the creation, management, preservation, use and re-use of trustworthy materials, including records. Recordkeeping metadata, of which one key type is archival description, plays a particularly important role in documenting the reliability and authenticity of records and recordkeeping systems, as well as the various contexts (legal-administrative, provenancial, procedural, documentary, and technical) within which records are created and kept as they move across space and time. In the digital environment, metadata is also the means by which it is possible to identify how record components – those constituent aspects of a digital record that may be managed, stored and used separately by the creator or the preserver – can be reassembled to generate an authentic copy of a record, or reformulated per a user's request as a customized output package. Issues relating to the creation, capture, management and preservation of adequate metadata are, therefore, integral to any research study addressing the reliability and authenticity of digital entities, regardless of the community, sector or institution within which they are being created. The InterPARES 2 Description Cross-Domain Group (DCD) examined the conceptualization, definitions, roles, and current functionality of metadata and archival description in terms of the requirements generated by InterPARES 1. Because of the need to communicate the work of InterPARES in a meaningful way across other disciplines as well as different archival traditions; to interface with, evaluate and inform existing standards, practices and other research projects; and to ensure interoperability across the three focus areas of InterPARES 2, the Description Cross-Domain Group also addressed its research goals with reference to wider thinking about, and developments in, recordkeeping and metadata. InterPARES 2 addressed not only records, however, but a range of digital information objects (referred to as "entities" by InterPARES 2, but not to be confused with the term "entities" as used in metadata and database applications) that are the products and by-products of government, scientific and artistic activities carried out using dynamic, interactive or experiential digital systems. The nature of these entities was determined through a diplomatic analysis undertaken as part of extensive case studies of digital systems conducted by the InterPARES 2 Focus Groups. This diplomatic analysis established whether the entities identified during the case studies were records, non-records that nevertheless raised important concerns relating to reliability and authenticity, or "potential records." To be determined to be records, the entities had to meet the criteria outlined by archival theory – they had to have a fixed documentary format and stable content. It was not sufficient that they be considered or treated as records by the creator. "Potential records" is a new construct indicating that a digital system has the potential to create records upon demand but does not actually fix and set aside records in the normal course of business.
The work of the Description Cross-Domain Group, therefore, addresses the metadata needs of all three categories of entities. Finally, since "metadata" as a term is used today so ubiquitously and in so many different ways by different communities that it is in peril of losing any specificity, part of the work of the DCD sought to name and type categories of metadata. It also addressed incentives for creators to generate appropriate metadata, as well as issues associated with the retention, maintenance and eventual disposition of the metadata that aggregates around digital entities over time.

Relevance:

100.00%

Publisher:

Abstract:

Much real-world data, including textual data, can be represented using graph structures. The use of graphs to represent textual data has many advantages, mainly related to preserving a larger amount of information, such as the relationships between words and their types. In recent years, many neural network architectures have been proposed to deal with tasks on graphs. Many of them consider only node features, ignoring or underweighting the relationships between nodes. However, in many node classification tasks those relationships play a fundamental role. This thesis analyzes the main GNNs, evaluates their advantages and disadvantages, proposes an innovative solution conceived as an extension of GAT, and applies it to a case study in the biomedical field. We implement the reference GNNs with the methodologies analyzed later and then apply them to a question answering system in the biomedical field as a replacement for its pre-existing GNN. We attempt to obtain better results by using models that can accept both node and edge features as input. As shown later, our proposed models outperform the original solution and define the state of the art for the task under analysis.
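As a rough sketch of how edge features can be injected into GAT-style attention (a common pattern, not necessarily the exact architecture proposed in the thesis), the plain-PyTorch layer below computes attention logits from the source node, target node and edge representations, then aggregates edge-augmented messages into each target node. All class and parameter names are illustrative.

# Minimal edge-aware attention layer (GAT-like), plain PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeAwareAttentionLayer(nn.Module):
    def __init__(self, in_dim, edge_dim, out_dim):
        super().__init__()
        self.node_proj = nn.Linear(in_dim, out_dim, bias=False)
        self.edge_proj = nn.Linear(edge_dim, out_dim, bias=False)
        self.attn = nn.Linear(3 * out_dim, 1, bias=False)

    def forward(self, x, edge_index, edge_attr):
        # x: [N, in_dim], edge_index: [2, E], edge_attr: [E, edge_dim]
        src, dst = edge_index
        h = self.node_proj(x)              # projected node features [N, out_dim]
        e = self.edge_proj(edge_attr)      # projected edge features [E, out_dim]
        # One attention logit per edge from (source, target, edge) features.
        logits = F.leaky_relu(
            self.attn(torch.cat([h[src], h[dst], e], dim=-1))
        ).squeeze(-1)
        # Softmax over the incoming edges of each target node.
        alpha = torch.zeros_like(logits)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = torch.softmax(logits[mask], dim=0)
        # Weighted aggregation of edge-augmented messages into target nodes.
        out = torch.zeros_like(h)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * (h[src] + e))
        return out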

Relevance:

100.00%

Publisher:

Abstract:

Natural Language Processing (NLP) has seen tremendous improvements over the last few years. Transformer architectures have achieved impressive results in almost any NLP task, such as Text Classification, Machine Translation, and Language Generation. As time went by, transformers continued to improve thanks to larger corpora and bigger networks, reaching hundreds of billions of parameters. Training and deploying such large models has become prohibitively expensive, to the point that only large technology companies can afford it. Therefore, a lot of research has been dedicated to reducing model size. In this thesis, we investigate the effects of Vocabulary Transfer and Knowledge Distillation for compressing large Language Models. The goal is to combine these two methodologies to further compress models without significant loss of performance. In particular, we designed different combination strategies and conducted a series of experiments on different vertical domains (medical, legal, news) and downstream tasks (Text Classification and Named Entity Recognition). Four different methods involving Vocabulary Transfer (VIPI), with and without a Masked Language Modelling (MLM) step and with and without Knowledge Distillation, are compared against a baseline that assigns random vectors to new elements of the vocabulary. Results indicate that VIPI effectively transfers information from the original vocabulary and that MLM is beneficial. Vocabulary transfer and knowledge distillation are also orthogonal to one another and may be applied jointly; applying knowledge distillation first and vocabulary transfer afterwards is recommended. Finally, model performance under vocabulary transfer does not always follow a consistent trend as the vocabulary size is reduced, so the vocabulary size should be selected empirically by evaluating on the downstream task, much like a hyperparameter.
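For reference, a minimal sketch of the standard temperature-scaled knowledge-distillation objective is shown below (plain PyTorch). It is the widely used Hinton-style formulation, not necessarily the thesis's exact setup, and the VIPI vocabulary-transfer step is not reproduced; the temperature and mixing weight are illustrative defaults.

# Standard knowledge-distillation loss: soft-target KL term plus hard-label CE.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend the temperature-scaled KL term (scaled by T^2, as is customary)
    with the usual cross-entropy on the gold labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard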

Relevance:

100.00%

Publisher:

Abstract:

The power transformer is a piece of electrical equipment that needs continuous monitoring and fast protection, since it is very expensive and essential for a power system to perform effectively. The most common protection technique is the percentage differential logic, which discriminates between internal faults and other operating conditions. Unfortunately, some operating conditions of power transformers can affect the protection behavior and the power system stability. This paper proposes a new algorithm to improve differential protection performance by using fuzzy logic and Clarke's transform. An electrical power system was modeled using the Alternative Transients Program (ATP) software to obtain the operating conditions and fault situations needed to test the developed algorithm. The results were compared with those of a commercial relay for validation, showing the advantages of the new method.
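Clarke's transform itself is standard; the NumPy sketch below shows the amplitude-invariant mapping of three instantaneous phase quantities into alpha, beta and zero-sequence components that such an algorithm would operate on. The fuzzy decision logic the paper builds on top of it is not reproduced here.

# Amplitude-invariant Clarke (alpha-beta-zero) transform of phase currents.
import numpy as np

def clarke_transform(ia, ib, ic):
    """Return (i_alpha, i_beta, i_zero) for instantaneous phase currents."""
    T = (2.0 / 3.0) * np.array([
        [1.0, -0.5,             -0.5],
        [0.0,  np.sqrt(3) / 2,  -np.sqrt(3) / 2],
        [0.5,  0.5,              0.5],
    ])
    return T @ np.array([ia, ib, ic])

# Example: a balanced set of currents yields a zero i_zero component.
print(clarke_transform(1.0, -0.5, -0.5))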

Relevance:

100.00%

Publisher:

Abstract:

The photodegradation of the herbicide clomazone in the presence of S2O8^2- or of humic substances of different origin was investigated. A value of (9.4 ± 0.4) × 10^8 M^-1 s^-1 was measured for the bimolecular rate constant of the reaction of sulfate radicals with clomazone in flash-photolysis experiments. Steady-state photolysis of peroxydisulfate, leading to the formation of sulfate radicals, in the presence of clomazone was shown to be an efficient photodegradation method for the herbicide. This is a relevant result for in situ chemical oxidation procedures involving peroxydisulfate as the oxidant. The main reaction products are 2-chlorobenzyl alcohol and 2-chlorobenzaldehyde. The degradation kinetics of clomazone was also studied under steady-state conditions induced by photolysis of Aldrich humic acid or a vermicompost extract (VCE). The results indicate that singlet oxygen is the main species responsible for clomazone degradation. The quantum yield of O2(a^1Δg) generation (λ = 400 nm) for the VCE in D2O, Φ_Δ = (1.3 ± 0.1) × 10^-3, was determined by measuring the O2(a^1Δg) phosphorescence at 1270 nm. The overall quenching constant of O2(a^1Δg) by clomazone was found to be (5.7 ± 0.3) × 10^7 M^-1 s^-1 in D2O. The bimolecular rate constant for the reaction of clomazone with singlet oxygen was k_r = (5.4 ± 0.1) × 10^7 M^-1 s^-1, which means that the quenching process is mainly reactive.
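As a back-of-the-envelope illustration of how such a bimolecular rate constant translates into degradation kinetics, the sketch below converts it into a pseudo-first-order half-life under an assumed steady-state sulfate-radical concentration. The radical concentration is purely illustrative and is not a value reported in the study.

# Pseudo-first-order half-life from a bimolecular rate constant.
import math

k_bi = 9.4e8        # M^-1 s^-1, reported for SO4 radical + clomazone
so4_ss = 1e-13      # M, assumed steady-state sulfate-radical concentration

k_obs = k_bi * so4_ss               # s^-1, pseudo-first-order rate constant
print(f"t_1/2 ~ {math.log(2) / k_obs / 60:.0f} min")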

Relevance:

100.00%

Publisher:

Abstract:

Nitrogen fertilization in common bean crops under no-tillage and conventional systems. Nitrogen fertilizer is necessary for high yields in common bean crops, and information on N responses under no-tillage and conventional systems is still a basic need. Thus, the objective of this research was to evaluate the effect of N application on common bean yield in no-tillage and conventional systems. The experimental design was a randomized block in a factorial scheme (2 x 8 + 1) with four replications. The treatments consisted of combinations of two N doses (40 and 80 kg ha^-1) applied as side-dressing at eight distinct stages of vegetative development of the common bean (V4-3, V4-4, V4-5, V4-6, V4-7, V4-8, V4-9 and V4-10), in addition to a control plot without side-dressed N. The experiment was conducted over two years (2002 and 2003) under no-tillage on millet crop residues and under a conventional plow system. It was concluded that N fertilization at the V4 stage of common bean promotes similar seed yields in no-tillage and conventional systems. Yield differences between no-tillage and conventional systems were inconsistent within the same agricultural area.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to compare the effects of Low-Intensity Laser Therapy (LILT) and low-intensity Light Emitting Diode Therapy (LEDT) on the treatment of lesioned Achilles tendons of rats. The experimental model consisted of a partial mechanical lesion on the deep portion of the right Achilles tendon of 90 rats. One hour after the lesion, the injured animals received applications of laser (685 or 830 nm) or LED (630 or 880 nm) light, and the same procedure was repeated at 24-h intervals for 10 days. The healing process and the deposition of collagen were evaluated by polarization microscopy analysis of the alignment and organization of the collagen bundles, through birefringence (optical retardation, OR). The results showed that treatments based on LEDT were effective and confirmed that LILT appears to be effective in the healing process. Although LED light lacks coherence, tendon healing treatment with LED was satisfactory and can replace treatments based on laser light applications. Applications of infrared laser at 830 nm and LED at 880 nm were the most efficient when the aim is good organization, aggregation, and alignment of the collagen bundles during tendon healing. However, more research is needed to determine a safe and more efficient LED protocol.