850 results for Educational Measurement - methods
Abstract:
A method for applying the Combined didactic interactive programme system to electrical engineering disciplines has been developed, and its suitability for supporting a full complex of study formats (lectures, tutorials, laboratory studies) as well as for organizing students' independent work has been verified. The method supports both reproductive (recognition and reproduction) and productive, heuristic educational-cognitive activity of students under conditions of gradual and complete instruction with closed-loop automatic control.
Abstract:
Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of multi-professional team working effectiveness in adult CMHTs to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1500 members in 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resultant 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners both nationally and internationally for monitoring, evaluating and improving team functioning in practice.
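The psychometric analysis behind the 20-item scale is not detailed in the abstract. As an illustration of one standard reliability check for a scale of this kind, here is a minimal Python sketch of Cronbach's alpha on simulated single-factor survey data; all names and numbers below are hypothetical, not the authors' code or data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical data: 1500 respondents x 20 items on a 5-point scale,
# driven by one latent factor, mirroring the one-factor structure reported.
rng = np.random.default_rng(0)
latent = rng.normal(size=(1500, 1))
scores = np.clip(np.rint(3 + latent + rng.normal(scale=0.8, size=(1500, 20))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```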
Abstract:
This paper presents MRI measurements of the response of a novel semi-solid MR contrast agent to pressure. The agent is composed of potassium-chloride-cross-linked carrageenan gum at a concentration of 2% w/v, with micron-sized, lipid-coated air bubbles at a concentration of 3% v/v. The choice of an optimum suspending medium, the methods of production and the preliminary MRI results are presented herein. The carrageenan gum is shown to be ideally elastic for compressions corresponding to volume changes of less than 15%, in contrast to the inelastic gellan gum also tested. Although slightly lower than that of gellan gum, carrageenan has a water diffusion coefficient of 1.72×10⁻⁹ m²·s⁻¹, indicating its suitability for this purpose. RARE imaging is performed whilst simultaneously compressing test and control samples, and a maximum sensitivity of 1.6% MR signal change per % volume change is found, which is shown to be independent of proton-density variations due to the presence of microbubbles and compression. This contrast agent could prove useful for numerous applications, particularly in chemical engineering. More generally, the method allows the user to non-invasively image with MRI any process that causes, within the solid, local changes in either bubble size or bubble shape. © 2008 American Institute of Physics.
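For orientation, the reported calibration can be inverted to turn a measured signal change into an estimated compression. The sketch below uses only the figures quoted in the abstract; the function name and sample readings are illustrative.

```python
# Reported calibration: 1.6 % MR signal change per % volume change,
# with ideal elasticity only for volume changes below 15 %.
SENSITIVITY = 1.6  # % signal change per % volume change

def volume_change_from_signal(signal_change_pct: float) -> float:
    """Estimate % volume change of the gel from a measured % MR signal change."""
    return signal_change_pct / SENSITIVITY

for ds in (4.0, 8.0, 16.0, 28.0):
    dv = volume_change_from_signal(ds)
    flag = "" if dv <= 15 else "  (beyond the reported elastic range)"
    print(f"{ds:5.1f} % signal change -> {dv:5.1f} % volume change{flag}")
```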
Abstract:
This paper reviews the state of the art in measuring, modeling, and managing clogging in subsurface-flow treatment wetlands. Methods for measuring in situ hydraulic conductivity in treatment wetlands are now available, which provide valuable insight into assessing and evaluating the extent of clogging. These results, paired with the information from more traditional approaches (e.g., tracer testing and composition of the clog matter), are being incorporated into the latest treatment wetland models. Recent finite element analysis models can now simulate clogging development in subsurface-flow treatment wetlands with reasonable accuracy. Various management strategies have been developed to extend the life of clogged treatment wetlands, including gravel excavation and/or washing, chemical treatment, and application of earthworms. These strategies are compared and available cost information is reported. © 2012 Elsevier Ltd.
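The abstract does not specify which in situ measurement method is meant; as a generic illustration, hydraulic conductivity assessments of this kind ultimately reduce to Darcy's law. The sketch below uses hypothetical permeameter readings, not values from the review.

```python
def hydraulic_conductivity(flow_m3_s: float, area_m2: float,
                           head_drop_m: float, length_m: float) -> float:
    """Darcy's law rearranged: K = Q * L / (A * dh), in m/s."""
    return flow_m3_s * length_m / (area_m2 * head_drop_m)

# Hypothetical constant-head test on a gravel sample from a wetland bed.
K = hydraulic_conductivity(flow_m3_s=2.0e-5, area_m2=0.05,
                           head_drop_m=0.10, length_m=0.30)
print(f"K = {K:.2e} m/s")  # clogging shows up as K dropping by orders of magnitude
```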
Abstract:
The present paper is devoted to the creation of cryptographic data security and the realization of packet mode in a distributed information measurement and control system that implements methods of optical spectroscopy for research on plasma physics and atomic collisions. The system provides remote access to information and hardware resources within Intranet/Internet networks. Access to the physical equipment is realized through the standard interface servers (PXI, CAMAC, and GPIB), a server providing access to Ethernet devices, and a communication server that integrates the equipment servers into a uniform information system. The system is used to carry out research tasks in optical spectroscopy, as well as to support teaching at the Department of Physics and Engineering of Petrozavodsk State University.
Abstract:
This research focuses on the optimisation of resource utilisation in wireless mobile networks, taking into account the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated, including video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It is shown that Pause Intensity is closely correlated with the subjective quality measurement in terms of the Mean Opinion Score, and that this correlation is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach addresses both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment in which the proposed research framework is examined and the results are compared with existing scheduling methods in terms of the achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users according to the perceived quality of the services received. Meanwhile, a trade-off between fairness and efficiency is maintained through online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, have been demonstrated in the context of LTE. Finally, the work on interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism for the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum.
The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and of each WiFi access point involved. The performance of non-seamless, user-controlled mobile traffic offloading (through mobile WiFi devices) has been evaluated and compared with that of standard operator-controlled WiFi hotspots.
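The thesis' exact Pause Intensity formula is not reproduced in this abstract. The sketch below is a hypothetical stand-in that merely combines the two impairment factors the abstract names, pause duration and pause frequency, into a single number.

```python
def pause_intensity_toy(pause_durations_s: list[float], playback_s: float) -> float:
    """Toy pause-intensity-style metric: the product of the fraction of time
    spent paused and the pause rate. An illustrative stand-in, not the
    thesis' actual Pause Intensity definition."""
    total_pause = sum(pause_durations_s)
    rate = len(pause_durations_s) / playback_s  # pauses per second
    return (total_pause / playback_s) * rate

# Equal total stalling, different frequency: frequent short stalls score higher.
print(pause_intensity_toy([6.0], playback_s=60.0))       # one long pause
print(pause_intensity_toy([1.0] * 6, playback_s=60.0))   # six short pauses
```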
Abstract:
In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to the ideal metrology condition of 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, where the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology that promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
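The linear scaling the paper describes as current practice can be written as L(20 °C) = L(measured) / (1 + α(T − 20)). A minimal sketch follows; the expansion coefficient is an assumed value for steel, not a figure from the paper, and a single temperature reading is exactly what a thermal gradient invalidates.

```python
# Minimal sketch of the standard linear correction to 20 °C described above.
ALPHA_STEEL = 11.5e-6  # coefficient of thermal expansion, 1/°C (assumed)

def length_at_20C(measured_m: float, temp_C: float,
                  alpha: float = ALPHA_STEEL) -> float:
    """Scale a measured length back to the 20 °C reference temperature."""
    return measured_m / (1 + alpha * (temp_C - 20.0))

# A 5 m artefact measured at 24 °C: a 4 °C offset shifts the result ~0.23 mm.
print(f"L(20 C) = {length_at_20C(5.000230, 24.0):.6f} m")
```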
Abstract:
Incorporating the Material Balance Principle (MBP) into industrial and agricultural performance measurement systems with pollutant factors has been on the rise in recent years. Many conventional methods of performance measurement have proven incompatible with material flow conditions. This study addresses eco-efficiency measurement adjusted for pollution, taking into account material flow conditions and the MBP requirements, in order to provide 'real' measures of performance that can serve as guides when making policies. We develop a new approach by integrating a slacks-based measure to enhance the Malmquist-Luenberger Index with a material balance condition that reflects the conservation of matter. This model is compared with a similar model, which incorporates MBP using the trade-off approach, to measure the productivity and eco-efficiency trends of power plants. The results are similar for both models, substantiating the robustness and applicability of the model proposed in this paper.
Abstract:
Purpose - To investigate whether the accuracy of intraocular pressure (IOP) measurements using rebound tonometry over disposable hydrogel (etafilcon A) contact lenses (CL) is affected by the positive power of the CLs. Methods - The experimental group comprised 26 subjects (8 male, 18 female). IOP measurements were undertaken on the subjects' right eyes in random order using a Rebound Tonometer (ICare). The CLs had powers of +2.00 D and +6.00 D. Measurements were taken over each contact lens and also before and after the CLs had been worn. Results - The IOP measured over both CLs was significantly lower than the value obtained without CLs (t-test; p < 0.001), but no significant difference was found between the two powers of CLs. Conclusions - Rebound tonometry over positive hydrogel CLs leads to a certain degree of IOP underestimation. This result did not change for the two positive lenses used in the experiment, despite their large difference in power and therefore in lens thickness. Optometrists should bear this in mind when measuring IOP with the rebound tonometer over plus-power contact lenses.
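The comparison described is a paired design. A minimal sketch of the paired t-test follows; the raw per-eye readings are not given in the abstract, so the simulated values below are hypothetical and only mimic the reported underestimation.

```python
import numpy as np
from scipy import stats

# Hypothetical paired IOP readings (mmHg) for 26 eyes, without a lens
# and over a plus-powered contact lens.
rng = np.random.default_rng(1)
iop_no_cl = rng.normal(loc=15.5, scale=2.5, size=26)
iop_over_cl = iop_no_cl - rng.normal(loc=1.8, scale=1.0, size=26)

t, p = stats.ttest_rel(iop_no_cl, iop_over_cl)
print(f"paired t = {t:.2f}, p = {p:.4g}")
```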
Abstract:
Aims: Specialist lifestyle management (SLiM) is a medically supported, dietetically led, structured group education and self-management programme focusing on weight management. Obese patients with Type 2 diabetes are perceived to find it more difficult to lose weight than those without diabetes. We aimed to compare the weight loss achieved by obese patients with or without Type 2 diabetes completing the SLiM programme. Methods: A prospective analysis of patients attending SLiM between 2009 and 2013 was conducted. Results: There were 454 obese patients (mean age 49.1 ± 11.6 years, women 72.5%, body mass index 49.8 ± 9.3 kg/m², weight 137.3 ± 28 kg). 152/454 patients (33%) had Type 2 diabetes, of whom 31 (20.4%) were insulin treated. Patients with Type 2 diabetes were older (52.4 ± 11.3 vs 47.5 ± 11.4 years, p < 0.001). SLiM resulted in significant weight loss in patients with (136.5 ± 27 vs 130.2 ± 25.3 kg, p < 0.001) or without (137.6 ± 29 vs 132.6 ± 28.4 kg, p < 0.001) Type 2 diabetes. Weight loss was comparable between patients with and without Type 2 diabetes (6.1 ± 7.9 vs 5.1 ± 7 kg, p = 0.2). The proportion of patients achieving ≥ 10% weight loss was similar between patients with and without Type 2 diabetes (10.5% vs 9.9%, p = 0.4). Insulin-treated patients lost similar weight to those not treated with insulin (6.3 ± 9.4 vs 6.1 ± 7.6 kg, p = 0.9). After adjustment for age, sex, referral weight and medications, Type 2 diabetes did not predict weight change during the SLiM programme (b = 0.3, p = 0.5). Conclusions: Attending the SLiM groups produces significant weight loss in patients with Type 2 diabetes that is comparable to that in patients without Type 2 diabetes. Insulin-treated patients lost similar weight to those not on insulin. Weight gain with Type 2 diabetes and insulin treatment is not 'unavoidable' if patients receive the appropriate support and education.
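The adjusted result (b = 0.3, p = 0.5) comes from a regression of weight change on diabetes status plus covariates. A sketch on simulated data shaped like the cohort follows; the patient-level data are, of course, not available from the abstract, and the generated outcome is built to be independent of diabetes status, as the study found.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cohort roughly matching the summary statistics reported above.
rng = np.random.default_rng(2)
n = 454
df = pd.DataFrame({
    "age": rng.normal(49.1, 11.6, n),
    "female": rng.integers(0, 2, n),
    "referral_weight": rng.normal(137.3, 28.0, n),
    "t2dm": rng.integers(0, 2, n),
})
df["weight_change"] = -5.5 + rng.normal(0.0, 7.5, n)  # no diabetes effect

fit = smf.ols("weight_change ~ t2dm + age + female + referral_weight", data=df).fit()
print(fit.params["t2dm"], fit.pvalues["t2dm"])  # analogous to the reported b and p
```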
Abstract:
However much a bank (or company, or insurance provider) concentrates only on business, it cannot avoid the financial (credit, market, operational or other) risks that must be measured and covered. Full cover is either very expensive or simply impossible, so every business unit has to hold some risk-free, liquid capital to avoid insolvency. Coherent risk measurement is needed: the capital allocated has to reflect the risks, yet even when the risks are measured well, an allocation problem remains. Thanks to diversification effects, the total risk of a portfolio is generally smaller than the sum of the risks of its sub-portfolios. Coherent capital allocation entails deciding how much capital to assign to each sub-portfolio, that is, how to distribute the benefits of diversification 'correctly'. This yields each asset's contribution to the overall risk. The study employs game theory and examples of compound options to demonstrate the coherent measurement and allocation of risks, draws attention to the dangers of inconsistencies, and examines how far the risk measurement methods applied in practice (notably value at risk, VaR) meet the requirements set by theory.
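One game-theoretic allocation rule of the kind the study discusses is the Shapley value, which splits the diversification benefit by averaging each unit's marginal risk contribution over all orderings. A toy sketch follows, with a made-up subadditive risk measure rather than the study's own model.

```python
from itertools import permutations

def shapley_allocation(units, risk):
    """Average marginal capital contribution of each unit over all orderings.
    `risk` maps a frozenset of units to a capital requirement."""
    shares = {u: 0.0 for u in units}
    orders = list(permutations(units))
    for order in orders:
        coalition = set()
        for u in order:
            before = risk(frozenset(coalition))
            coalition.add(u)
            shares[u] += risk(frozenset(coalition)) - before
    return {u: s / len(orders) for u, s in shares.items()}

# Toy subadditive risk: any coalition of two or more units gets a 20 %
# diversification discount on the sum of stand-alone risks.
standalone = {"A": 10.0, "B": 8.0, "C": 6.0}
def risk(S):
    total = sum(standalone[u] for u in S)
    return 0.8 * total if len(S) > 1 else total

print(shapley_allocation(list(standalone), risk))  # shares sum to risk({A,B,C})
```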
Abstract:
Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field called nanostructured materials has begun to take shape, which takes advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials can be tailored to perform as no previous material could. The idea of the ionized cluster beam (ICB) thin-film deposition technique was first proposed by Takagi in 1972. It was based upon using a supersonic jet source to produce, ionize and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for the formation of cluster beams suitable for thin-film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development. In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and a study of the use of ICB in the fabrication of semiconductor devices. The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory. A novel in situ, real-time mass, energy and velocity measurement apparatus has been designed, built and tested. This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from the problems associated with other methods of cluster size measurement, such as the need for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependence on the assumption of homogeneous nucleation, limits on the measurable size, and the lack of real-time capability. Measured ion energies using the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity v is measured by pulsing the cluster beam and measuring the delay between the pulse and the analyzer output current. The mass of a particle is then calculated from m = 2E/v². The error in the measured value of the background gas mass is on the order of 28% of the mass of one N₂ molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes. Selective-area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully-ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimension, and substrate thickness.
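The mass relation quoted above is easy to make concrete. Below is a worked numerical example of m = 2E/v² with assumed energy, drift length and delay values; none of these figures come from the thesis.

```python
# Worked example of m = 2E / v^2 as used by the time-of-flight analyzer.
E_EV = 5.0e4            # measured ion energy in eV (assumed)
E_J = E_EV * 1.602e-19  # energy in joules
DRIFT_M = 0.50          # assumed drift length between pulser and detector
DELAY_S = 1.0e-4        # measured pulse-to-signal delay (assumed)

v = DRIFT_M / DELAY_S   # cluster velocity, m/s
m_kg = 2.0 * E_J / v**2
m_amu = m_kg / 1.6605e-27
print(f"v = {v:.3g} m/s, m = {m_amu:.3g} amu (~{m_amu / 65.38:.0f} Zn atoms)")
```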
Abstract:
This study compared the performance of students who earned GED credentials in Florida with that of graduates of Florida high schools, when members of both groups enrolled for the first time in fall 1992 at an urban multicultural community college in south Florida. GEDs and HSDs were matched on gender, race, age range, placement levels, and enrollment in college preparatory courses (reading, English, mathematics). The paired-samples t-test compared course grades, first-semester GPA, and total college GPA for the groups and subgroups of matched students at a probability level of .05. The McNemar test compared how many students in each group and subgroup re-enrolled for a second and third term, or ever; how many were placed on special academic status during their college enrollment; and how many graduated within 16 semesters. Differences between groups were found only for placement on probation, with HSDs on probation in significantly higher proportion than GEDs. Additional findings among subgroups revealed that male and Caucasian HSD subjects earned higher math grades than their GED counterparts. Male HSDs were more likely than male GEDs to return to the college at some point after the first term. However, male HSDs were placed on probation in greater proportion than the GEDs with whom they were matched. Female GEDs earned higher English grades and higher first-semester and cumulative GPAs and returned to the college in greater proportion than their HSD counterparts. Black GEDs earned higher first-semester GPAs, re-enrolled in terms 2 and 3, and graduated from the college in higher percentages than Black HSDs. Black HSDs were placed on probation in higher proportion than Black GEDs. Lastly, greater percentages of HSD than GED subjects in the lowest age range (16-19) were placed on probation. Results connected to the performance of Black GED subjects are likely to have been affected by the fact that 50% of Black study subjects had been born in Jamaica. The place of the GED in the constellation of methods for earning credit by examination is explored, future implications are discussed, and further study is recommended.
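The McNemar test used above compares paired binary outcomes through the discordant pairs. A minimal sketch with a hypothetical 2x2 table of matched GED/HSD pairs follows; the study's actual counts are not reported in the abstract.

```python
from statsmodels.stats.contingency_tables import mcnemar

# Rows: GED on probation yes/no; columns: matched HSD on probation yes/no.
# Counts are invented for illustration only.
table = [[12, 9],
         [31, 98]]

result = mcnemar(table, exact=True)  # exact binomial test on the discordant cells
print(f"statistic = {result.statistic}, p = {result.pvalue:.4f}")
```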
Abstract:
Accurate knowledge of the time since death, or postmortem interval (PMI), has enormous legal, criminological, and psychological impact. In this study, an investigation was made to determine whether the relationship between the degradation of the human cardiac structural protein Cardiac Troponin T and PMI could be used as an indicator of time since death, thus providing a rapid, high-resolution, sensitive, and automated methodology for the determination of PMI. The use of Cardiac Troponin T (cTnT), a protein found in heart tissue, as a selective marker for cardiac muscle damage has shown great promise in the determination of PMI. An optimized conventional immunoassay method was developed to quantify intact and fragmented cTnT. A small sample of cardiac tissue, which is less affected than other tissues by external factors, was taken, homogenized, extracted with magnetic microparticles, separated by SDS-PAGE, and visualized by Western blot by probing with a monoclonal antibody against cTnT, followed by labeling and detection with available scanners. This conventional immunoassay provides proper detection and quantitation of the cTnT protein in cardiac tissue as a complex matrix; however, it does not provide the analyst with immediate results. Therefore, a competitive separation method using capillary electrophoresis with laser-induced fluorescence (CE-LIF) was developed to study the interaction between the human cTnT protein and a monoclonal anti-Troponin T antibody. Analysis of the results revealed a linear relationship between the percent of degraded cTnT and the log of the PMI, indicating that intact cTnT could be detected in human heart tissue up to 10 days postmortem at room temperature and beyond two weeks at 4 °C. The data presented demonstrate that this technique can provide an extended time range during which PMI can be more accurately estimated compared with currently used methods, and that it represents a major advance in time-of-death determination through a fast and reliable, semi-quantitative measurement of a biochemical marker from an organ protected from outside factors.
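Because the reported relationship is linear between percent degraded cTnT and the logarithm of PMI, a fitted line can be inverted to estimate time since death. A sketch with invented calibration points follows; the study's data are not reproduced in the abstract.

```python
import numpy as np

# Invented calibration points: % degraded cTnT vs postmortem interval (days).
pmi_days = np.array([0.5, 1.0, 2.0, 4.0, 7.0, 10.0])
pct_degraded = np.array([12.0, 25.0, 38.0, 52.0, 66.0, 74.0])

# Fit: pct = slope * log10(PMI) + intercept, then invert to estimate PMI.
slope, intercept = np.polyfit(np.log10(pmi_days), pct_degraded, 1)

def estimate_pmi_days(pct: float) -> float:
    return 10.0 ** ((pct - intercept) / slope)

print(f"60% degraded -> PMI ~ {estimate_pmi_days(60.0):.1f} days")
```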
Abstract:
Along with the accumulation of evidence supporting the role of entrepreneurship in economic development (Acs & Armington, 2006; Kuratko, 2005; Reynolds, 2007), governments have persisted in encouraging people to become entrepreneurs (Acs & Stough, 2008; Brannback & Carsrud, 2008). These efforts have tried to reproduce the conditions under which entrepreneurship emerges. One of these conditions is to develop entrepreneurial skills among students and scientists (Fan & Foo, 2004). Entrepreneurship education within higher education has experienced a remarkable expansion in the last 20 years (Green, 2008). To develop entrepreneurial skills among students, scholars have proposed different teaching approaches. However, no clear relationship has been demonstrated between entrepreneurship education, learning outcomes, and business creation (Hostager & Decker, 1999). Despite policy makers' demands for more accountability from educational institutions (Klimoski, 2007) and entrepreneurship instructors' demands for consistency about what should be taught and how (Maidment, 2009), the appropriate content for entrepreneurship programs remains under constant discussion (Solomon, 2007). Entrepreneurship education is still in its infancy: professors propose diverse teaching goals and radically different teaching methods, which represents an obstacle to the development of foundational and consistent curricula across the board (Cone, 2008). Entrepreneurship education needs a better conceptualization of the learning outcomes pursued in order to develop consistent curricula. Many schools do not have enough qualified faculty to meet the growing student demand, and a consistent curriculum is needed for faculty development. Entrepreneurship instructors and their teaching practices are of interest because they have a role in producing the entrepreneurs needed to grow the economy. This study was designed to understand instructors' perspectives and actions related to their teaching. The sample studied consisted of eight college and university entrepreneurship instructors. Cases met predetermined criteria of importance, following maximum variation strategies. Results suggest that teaching content was consistent across participants, while different teaching goals were identified: some instructors inspire and develop general skills of students, while others envision the creation of a real business as the major outcome of their course. A relationship was found between the methods reported by instructors and their disciplinary background, teaching perspective, and entrepreneurial experience.