975 results for "Test content"


Relevance: 30.00%

Abstract:

State standardized testing has always been a tool to measure a school's performance and to help evaluate school curricula. However, with the school of choice legislation in 1992, the MEAP test became a measuring stick to grade schools by and a major tool in attracting school of choice students. Now, declining enrollment and a state budget struggling to stay out of the red have made school of choice students more important than ever before. MEAP scores have become the deciding factor in some cases. For the past five years, the Hancock Middle School staff has been working hard to improve their students' MEAP scores in accordance with President Bush's "No Child Left Behind" legislation. In 2005, the school was awarded a grant that enabled staff to spend two years writing and working toward school goals based on the improvement of MEAP scores in writing and math. As part of this effort, the school purchased an internet-based program geared toward giving students practice with state content standards. This study examined the results of efforts by Hancock Middle School to improve student mathematics scores on the MEAP test through the use of an online program called "Study Island." In the past, the program had been used to remediate students and as an end-of-year review, with an incentive for students completing a certain number of objectives. It had also been used as a review before upcoming MEAP testing in the fall. All of these methods may have helped a few students perform at a higher level on their standardized test, but the question remained whether sustained use of the program in a classroom setting would increase understanding of concepts and MEAP performance for the majority of students. This study addressed that question. Student MEAP scores and Study Island data from experimental and comparison groups of students were compared to understand how sustained classroom use of Study Island would affect student MEAP test scores. In addition, these data were analyzed to determine whether Study Island results provide a good indicator of students' MEAP performance. The results of the study suggest that there were limited benefits related to sustained use of Study Island, and they give some indication of the effectiveness of the mathematics curriculum at Hancock Middle School. These results and their implications for instruction are discussed.
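
The two comparisons described, experimental versus comparison group MEAP outcomes and the predictive value of Study Island results, reduce to a two-sample test and a correlation. A minimal Python sketch with invented score arrays (the numbers and group sizes are illustrative assumptions, not the study's data):

```python
import numpy as np
from scipy import stats

# Invented MEAP scale-score gains, not the study's records
experimental = np.array([12, 8, 15, 5, 10, 9, 14, 7, 11, 6])  # sustained Study Island use
comparison   = np.array([7, 4, 9, 3, 8, 5, 10, 2, 6, 4])      # regular instruction only

# Did the sustained-use group gain more? (two-sample Welch t-test)
t, p = stats.ttest_ind(experimental, comparison, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")

# Do Study Island results track MEAP performance? (correlation)
study_island = np.array([55, 40, 70, 30, 60, 45, 68, 35, 58, 33])   # % objectives mastered
meap         = np.array([820, 805, 840, 798, 826, 810, 838, 800, 824, 799])
r, p_r = stats.pearsonr(study_island, meap)
print(f"Pearson r = {r:.2f}, p = {p_r:.3f}")
```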

Relevance: 30.00%

Abstract:

The effects of Si content and cooling rate on the mechanical properties and microstructure of castings are investigated. Three alloys with varying C and Si contents were chosen, with an attempt made to keep the remaining elements constant. Within each heat, three test blocks were poured. Two blocks had chills – one with a fluid flowing through it to cool it (active chill) and one without the fluid (passive) – and the third block had no chill. Cooling curves were gathered and analyzed. The mechanical properties of the castings were correlated with the microstructure, cooling rate, and Si content of each block. It was found that an increase in Si content increased the yield stress, tensile strength, and hardness but decreased the impact toughness, elongation, and Young's modulus. The fast cooling rates produced by the chills caused a high nodule count in the castings, along with a fine ferrite grain size and a high degree of nodularity. The fine microstructures, in turn, increased the strength and the ductile-to-brittle transition temperature (DBTT) of the castings. The fast cooling rate was not adequate to overcome the dramatic increase in DBTT caused by the addition of Si.
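
The property-versus-composition correlations reported here can be illustrated with an ordinary least-squares fit of a property against Si content and cooling rate. A sketch with invented placeholder measurements, not the values measured from these heats:

```python
import numpy as np

# Invented placeholder data: one row per test block
si      = np.array([2.0, 2.0, 2.0, 2.5, 2.5, 2.5, 3.0, 3.0, 3.0])   # wt% Si
cooling = np.array([5.0, 2.0, 0.5, 5.0, 2.0, 0.5, 5.0, 2.0, 0.5])   # cooling rate (°C/s)
yield_stress = np.array([310, 300, 295, 345, 338, 330, 385, 376, 370])  # MPa

# Fit yield_stress = b0 + b1*si + b2*cooling by least squares
X = np.column_stack([np.ones_like(si), si, cooling])
coef, *_ = np.linalg.lstsq(X, yield_stress, rcond=None)
print(f"intercept = {coef[0]:.1f} MPa, "
      f"dYS/dSi = {coef[1]:.1f} MPa per wt%, "
      f"dYS/dRate = {coef[2]:.1f} MPa per °C/s")
```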

Relevance: 30.00%

Abstract:

The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national law makers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test's open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, Web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1) and the specific merits of these two distinct approaches (section 2) will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).

Relevance: 30.00%

Abstract:

The execution of a project requires resources that are generally scarce. Classical approaches to resource allocation assume that the usage of these resources by an individual project activity is constant during the execution of that activity; in practice, however, the project manager may vary resource usage over time within prescribed bounds. This variation gives rise to the project scheduling problem which consists in allocating the scarce resources to the project activities over time such that the project duration is minimized, the total number of resource units allocated equals the prescribed work content of each activity, and various work-content-related constraints are met. We formulate this problem for the first time as a mixed-integer linear program. Our computational results for a standard test set from the literature indicate that this model outperforms the state-of-the-art solution methods for this problem.
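
The abstract does not reproduce the model itself, so the following is a deliberately stripped-down PuLP sketch of the ingredients it names: a binary in-progress variable, a continuous per-period resource usage bounded while the activity runs, a work-content equation, a capacity constraint, and a makespan objective. The data are invented, and precedence and contiguity constraints (which the full formulation would need) are omitted:

```python
import pulp

# Invented instance, not a benchmark from the test set
activities = ["A", "B", "C"]
work = {"A": 6, "B": 8, "C": 4}   # prescribed work content (resource-periods)
lo   = {"A": 1, "B": 2, "C": 1}   # min usage per period while in progress
hi   = {"A": 3, "B": 4, "C": 2}   # max usage per period while in progress
R, T = 6, 10                      # resource capacity, planning horizon (periods)

prob = pulp.LpProblem("work_content_scheduling", pulp.LpMinimize)
y = pulp.LpVariable.dicts("y", [(a, t) for a in activities for t in range(T)], cat="Binary")
x = pulp.LpVariable.dicts("x", [(a, t) for a in activities for t in range(T)], lowBound=0)
makespan = pulp.LpVariable("makespan", lowBound=0)

prob += makespan  # minimize project duration
for a in activities:
    prob += pulp.lpSum(x[a, t] for t in range(T)) == work[a]   # total work content met
    for t in range(T):
        prob += x[a, t] >= lo[a] * y[a, t]    # usage bounds apply only while active
        prob += x[a, t] <= hi[a] * y[a, t]
        prob += makespan >= (t + 1) * y[a, t]  # duration covers every active period
for t in range(T):
    prob += pulp.lpSum(x[a, t] for a in activities) <= R        # capacity per period

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan =", pulp.value(makespan))
```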

Relevance: 30.00%

Abstract:

A water desaturation zone develops around a tunnel in water-saturated rock when the evaporative water loss at the rock surface is larger than the water flow from the surrounding saturated region of restricted permeability. We describe the methods with which such water desaturation processes in rock materials can be quantified. The water retention characteristic θ(ψ) of crystalline rock samples was determined with a pressure membrane apparatus. The negative water potential, identical to the capillary pressure ψ, below the tensiometric range (ψ < -0.1 MPa) can be measured with thermocouple psychrometers (TP), and the volumetric water contents, θ, by means of time domain reflectometry (TDR). These standard methods were adapted for measuring the water status in a macroscopically unfissured granodiorite with a total porosity of approximately 0.01. The measured water retention curve of granodiorite samples from the Grimsel test site (central Switzerland) exhibits a shape typical for bimodal pore size distributions. The measured bimodality is probably an artifact of a large solid-to-void surface ratio. The thermocouples were installed without a metallic screen, using the cavity drilled into the granodiorite as a measuring chamber. The water potentials observed in a cylindrical granodiorite monolith ranged between -0.1 and -3.0 MPa; those near the wall in a ventilated tunnel between -0.1 and -2.2 MPa. Two types of three-rod TDR probes were used, one as a depth probe inserted into the rock, the other as a surface probe using three copper strips attached to the surface for detecting water content changes at the rock-to-air boundary. The TDR signal was smoothed with a low-pass filter, and the signal length was determined from the first derivative of the trace. Despite the low porosity of crystalline rock, these standard methods are applicable for describing the unsaturated zone in solid rock and may also be used in other consolidated materials such as concrete.
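
The trace processing described in the last steps, low-pass smoothing followed by locating the reflection from the first derivative, can be sketched as follows on a synthetic trace; the Butterworth filter and its cutoff are assumptions, not the authors' exact settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Synthetic TDR trace: a noisy step standing in for the end-of-probe reflection
n = 1000
trace = np.concatenate([np.zeros(600), np.linspace(0.0, 1.0, 50), np.ones(350)])
trace += np.random.default_rng(0).normal(0, 0.05, n)

# Low-pass filter the trace (4th-order Butterworth, assumed normalized cutoff)
b, a = butter(4, 0.05)
smooth = filtfilt(b, a, trace)

# Locate the reflection as the maximum of the first derivative
d1 = np.gradient(smooth)
reflection_idx = int(np.argmax(d1))
print(f"end-of-probe reflection near sample {reflection_idx}")
```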

Relevance: 30.00%

Abstract:

The purposes of this study were to examine (1) the relationship between selected components of the content of prenatal care and spontaneous preterm birth; and (2) the degree of comparability between maternal and caregivers' responses regarding the number of prenatal care visits, selected components of the content of prenatal care, and gestational age, based on analyses of the 1988 National Maternal and Infant Health Survey conducted by the National Center for Health Statistics. Spontaneous preterm birth was subcategorized into very preterm and moderately preterm births, with term births as the controls. The study population was limited to non-Hispanic Anglo- and African-American mothers. Racial differences in birth outcomes were also compared. This study concluded that: (1) there was not a high degree of comparability (less than 80%) between maternal and prenatal care providers' responses regarding the number of prenatal care visits and the content of prenatal care; (2) there was a low degree of comparability (less than 50%) between maternal and delivery-hospital responses regarding gestational age at birth; (3) there were differences in selected components of the content of prenatal care between the cases and controls, overall and stratified by ethnicity (i.e., hemoglobin/hematocrit test, weight measurement, and breast-feeding counseling), but they were confounded with missing values and associated preterm delivery bias; (4) there were differences in selected components of the content of prenatal care between Anglo- and African-American cases (i.e., vitamin/mineral supplement advice, weight measurement, smoking cessation and drug abuse counseling), but they, too, were difficult to interpret definitively due to item nonresponse and preterm delivery biases; (5) no significant predictive association between selected components of the content of prenatal care and spontaneous preterm birth was found; and (6) inadequate/intermediate prenatal care and birth out of wedlock were found to be associated with moderately preterm birth. Future research is needed to examine the validity of maternal and prenatal care providers' responses and identify the sources of disagreement between their responses. In addition, further studies are needed to examine the relationship between the quality of prenatal care and preterm birth. Finally, the completeness and quality of patient and provider data on the utilization and content of prenatal care need to be strengthened in subsequent studies.
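
"Degree of comparability" between two reporters is a question of inter-rater agreement, commonly summarized as percent agreement plus a chance-corrected index such as Cohen's kappa. A small sketch on made-up paired yes/no reports (the data are invented; only the 80% and 50% thresholds above come from the study):

```python
import numpy as np

def percent_agreement(a, b):
    """Raw agreement (%) between two paired response vectors."""
    a, b = np.asarray(a), np.asarray(b)
    return np.mean(a == b) * 100

def cohens_kappa(a, b):
    """Chance-corrected agreement for two raters over the same items."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                        # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)

# Made-up paired reports: did the visit include a hemoglobin/hematocrit test? (1 = yes)
maternal = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
provider = [1, 0, 0, 1, 1, 1, 1, 0, 0, 1]
print(f"agreement = {percent_agreement(maternal, provider):.0f}%, "
      f"kappa = {cohens_kappa(maternal, provider):.2f}")
```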

Relevance: 30.00%

Abstract:

This multi-phase study examined the influence of retrieval processes on children’s metacognitive processes in relation to, and in interaction with, achievement level and age. First, N = 150 9/10- and 11/12-year-old high and low achievers watched an educational film and predicted their test performance. The children then solved a cloze test on the film content, including answerable and unanswerable items, and gave confidence judgments for every answer. Finally, the children withdrew answers that they believed to be incorrect. All children showed adequate metacognitive processes before and during test taking, with 11/12-year-olds outperforming 9/10-year-olds when considering characteristics of on-going retrieval processes. As to the influence of achievement level, high achievers proved to be more accurate than low achievers in their metacognitive monitoring and control. Results suggest that both cognitive resources (operationalized through achievement level) and mnemonic experience (assessed through age) fuel metacognitive development. Nevertheless, when facing higher demands regarding retrieval processes, experience seems to play the more important role.
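
Monitoring accuracy of this kind is commonly quantified as the Goodman-Kruskal gamma between item-level confidence judgments and answer correctness; whether this study used gamma is not stated in the abstract, so the sketch below is illustrative, with invented ratings:

```python
def goodman_kruskal_gamma(confidence, correct):
    """Rank association between confidence ratings and 0/1 correctness."""
    concordant = discordant = 0
    n = len(confidence)
    for i in range(n):
        for j in range(i + 1, n):
            s = (confidence[i] - confidence[j]) * (correct[i] - correct[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    pairs = concordant + discordant
    return (concordant - discordant) / pairs if pairs else float("nan")

# Invented data: 7-point confidence judgments and whether each cloze answer was correct
confidence = [7, 6, 2, 5, 1, 6, 3, 2]
correct    = [1, 1, 0, 1, 0, 1, 0, 1]
print(f"gamma = {goodman_kruskal_gamma(confidence, correct):.2f}")
```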

Relevance: 30.00%

Abstract:

This study analyzed the influence of the occupational context on the conceptualization of career satisfaction measured by the Career Satisfaction Scale (CSS). In a large sample of N = 729 highly educated professionals, a cross-occupational (i.e., physicians, economists, engineers, and teachers) measurement invariance analysis showed that the CSS was conceptualized according to occupational group membership; that is, 4 of the 5 items of the scale showed measurement noninvariance. More specifically, the relative importance, the response biases, and the reliabilities associated with different career satisfaction content domains measured by the CSS (i.e., achieved success, overall career goals, goals for advancement, goals for income, and goals for development of new skills) varied by occupational context. However, results of a comparison between manifest and latent mean differences between the occupational groups revealed that the observed measurement noninvariance did not affect the estimation of mean differences.

Relevance: 30.00%

Abstract:

Chronic aerobic exercise has been shown to increase exercise efficiency, thus allowing less energy expenditure for a similar amount of work. The extent to which skeletal muscle mitochondria play a role in this is not fully understood, particularly in an elderly population. The purpose of this study was to determine the relationship of exercise efficiency with mitochondrial content and function. We hypothesized that the greater the mitochondrial content and/or function, the greater the efficiencies would be. Thirty-eight sedentary (S, n = 23, 10F/13M) or athletic (A, n = 15, 6F/9M) older adults (66.8 ± 0.8 years) participated in this cross-sectional study. VO2peak was measured with a cycle ergometer graded exercise test (GXT). Gross efficiency (GE, %) and net efficiency (NE, %) were estimated during a 1-h submaximal test (55% VO2peak). Delta efficiency (DE, %) was calculated from the GXT. Mitochondrial function was measured as ATPmax (mmol/L/s) during a PCr recovery protocol with ³¹P-MR spectroscopy. Muscle biopsies were acquired for determination of mitochondrial volume density (MitoVD, %). Efficiencies were 17% (GE), 14% (NE), and 16% (DE) higher in A than in S. MitoVD was 29% higher in A, and ATPmax was 24% higher in A than in S. All efficiencies correlated positively with both ATPmax and MitoVD. Chronically trained older individuals had greater mitochondrial content and function, as well as greater exercise efficiencies. GE, NE, and DE were related to both mitochondrial content and function. This suggests a possible role of mitochondria in improving exercise efficiency in elderly athletic populations, allowing conservation of energy at moderate workloads.
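
The three efficiencies have standard definitions: gross is work rate over total energy expenditure, net is work rate over energy expenditure above rest, and delta is the slope of work rate against energy expenditure across GXT stages. A rough sketch with invented numbers; the caloric equivalent of 4.9 kcal per litre of O2 is an assumption, and the study's exact computation may differ:

```python
import numpy as np

KCAL_PER_L_O2 = 4.9            # assumed caloric equivalent of O2 (kcal/L)
WATT_TO_KCAL_MIN = 60 / 4186   # 1 W of mechanical work, expressed in kcal/min

def kcal_per_min(vo2_l_min):
    """Energy expenditure (kcal/min) from oxygen uptake (L/min)."""
    return vo2_l_min * KCAL_PER_L_O2

# Invented example values, not the study's measurements
work_w = 75.0      # submaximal work rate (W)
vo2 = 1.40         # VO2 during the submaximal test (L/min)
vo2_rest = 0.30    # resting VO2 (L/min)

ge = work_w * WATT_TO_KCAL_MIN / kcal_per_min(vo2) * 100
ne = work_w * WATT_TO_KCAL_MIN / (kcal_per_min(vo2) - kcal_per_min(vo2_rest)) * 100

# Delta efficiency: slope of work rate against energy expenditure across GXT stages
stage_w = np.array([50.0, 75.0, 100.0, 125.0])    # stage work rates (W)
stage_vo2 = np.array([1.05, 1.40, 1.76, 2.10])    # stage VO2 (L/min)
de = np.polyfit(kcal_per_min(stage_vo2), stage_w * WATT_TO_KCAL_MIN, 1)[0] * 100

print(f"GE = {ge:.1f}%, NE = {ne:.1f}%, DE = {de:.1f}%")
```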

Relevance: 30.00%

Abstract:

Five test runs were performed to assess possible bias when using the loss-on-ignition (LOI) method to estimate the organic matter and carbonate content of lake sediments. An accurate and stable weight loss was achieved after 2 h of burning pure CaCO3 at 950 °C, whereas LOI of pure graphite at 550 °C showed a direct relation to sample size and exposure time, with only 40-70% of the possible weight loss reached after 2 h of exposure and smaller samples losing weight faster than larger ones. Experiments with a standardised lake sediment revealed a strong initial weight loss at 550 °C, but samples continued to lose weight at a slow rate at exposures of up to 64 h, which was likely the effect of loss of volatile salts, structural water of clay minerals or metal oxides, or of inorganic carbon after the initial burning of organic matter. A further test run revealed that at 550 °C samples in the centre of the furnace lost more weight than marginal samples. At 950 °C this pattern was still apparent, but the differences became negligible. Again, LOI was dependent on sample size. An analytical LOI quality control experiment involving ten different laboratories was carried out, using each laboratory's own LOI procedure as well as a standardised LOI procedure to analyse three different sediments. The range of LOI values between laboratories measured at 550 °C was generally larger when each laboratory used its own method than when using the standard method. This was similar for 950 °C, although the range of values tended to be smaller. The within-laboratory range of LOI measurements for a given sediment was generally small. Comparison of the results of the individual and the standardised methods suggests that there is a laboratory-specific pattern in the results, probably due to differences in laboratory equipment and/or handling that could not be eliminated by standardising the LOI procedure. Factors such as sample size, exposure time, position of samples in the furnace, and the laboratory measuring affected the LOI results, with LOI at 550 °C being more susceptible to these factors than LOI at 950 °C. We therefore recommend that analysts be consistent in the LOI method used with respect to ignition temperatures, exposure times, and sample size, and that they include information on these three parameters when referring to the method.
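
The weight-loss arithmetic behind these runs follows the usual sequential LOI scheme: dry at 105 °C, ignite at 550 °C for organic matter, then at 950 °C for carbonate-derived CO2, with the conventional 1.36 (60/44) molecular-weight factor converting CO2 loss to carbonate. A brief sketch with invented sample weights:

```python
def loi(dw105, dw550, dw950):
    """Sequential loss on ignition from dry weights (g) at 105, 550, and 950 °C."""
    loi550 = (dw105 - dw550) / dw105 * 100   # weight loss at 550 °C ~ organic matter (%)
    loi950 = (dw550 - dw950) / dw105 * 100   # CO2 driven off the carbonates (%)
    carbonate = loi950 * 1.36                # CO3 from CO2 loss: 60/44 weight ratio
    return loi550, loi950, carbonate

# Invented sample weights (g): after drying, after 550 °C, after 950 °C
print(loi(1.000, 0.850, 0.806))   # -> (15.0, 4.4, ~6.0)
```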

Relevance: 30.00%

Abstract:

In France, farmers commission about 250,000 soil-testing analyses per year to assist them in managing soil fertility. The number and diversity of origin of the samples make these analyses an interesting and original information source regarding cultivated topsoil variability. Moreover, these analyses relate to several parameters strongly influenced by human activity (macronutrient contents, pH, etc.), for which existing cartographic information is not very relevant. Compiling the results of these analyses into a database makes it possible to re-use these data within both a national and a temporal framework. A database compilation covering the period 1990-2009 has recently been completed. So far, commercial soil-testing laboratories approved by the Ministry of Agriculture have provided analytical results for more than 2,000,000 samples. After the initial quality control stage, analytical results from more than 1,900,000 samples were available in the database. The anonymity of the landholders seeking soil analyses is fully preserved, as the only identifying information stored is the location of the administrative city nearest to the sample site. We present in this dataset a set of statistical parameters of the spatial distributions of several agronomic soil properties. These statistical parameters are calculated for 4 nested spatial entities (administrative areas: e.g., regions, departments, counties, and agricultural areas) and for 4 time periods (1990-1994, 1995-1999, 2000-2004, 2005-2009). Two kinds of agronomic soil properties are available: the first corresponds to quantitative variables, such as organic carbon content, and the second to qualitative variables, such as texture class. For each spatial unit and time period, we calculated the following sets of statistics: the first set, calculated for the quantitative variables, comprises the number of samples, the mean, the standard deviation, and the 2-, 4-, and 10-quantiles (median, quartiles, and deciles); the second set, calculated for the qualitative variables, comprises the number of samples, the dominant class, the number of samples in the dominant class, the second dominant class, and the number of samples in the second dominant class.
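
Both statistics sets fall out of a group-by over spatial unit and period. A pandas sketch over a mock extract (the column names and values are assumptions, not the database schema):

```python
import pandas as pd

# Mock extract: assumed column names, invented values
df = pd.DataFrame({
    "county":  ["A", "A", "A", "B", "B", "B"],
    "period":  ["1990-1994"] * 6,
    "org_c":   [11.2, 9.8, 14.1, 20.3, 18.7, 22.4],    # organic carbon (g/kg)
    "texture": ["loam", "loam", "clay", "silt", "silt", "silt"],
})

# Quantitative set: count, mean, standard deviation, median, quartiles, deciles
quant = df.groupby(["county", "period"])["org_c"].agg(
    n="count", mean="mean", std="std", median="median",
    q1=lambda s: s.quantile(0.25), q3=lambda s: s.quantile(0.75),
    d1=lambda s: s.quantile(0.10), d9=lambda s: s.quantile(0.90),
)

# Qualitative set: dominant and second-dominant class with their counts
def class_stats(s):
    counts = s.value_counts()
    return pd.Series({
        "n": len(s),
        "dominant": counts.index[0], "n_dominant": counts.iloc[0],
        "second": counts.index[1] if len(counts) > 1 else None,
        "n_second": counts.iloc[1] if len(counts) > 1 else 0,
    })

qual = df.groupby(["county", "period"])["texture"].apply(class_stats).unstack()
print(quant, qual, sep="\n\n")
```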

Relevance: 30.00%

Abstract:

We investigated surface and deep ocean variability in the subpolar North Atlantic from 1000 to 500 thousand years ago (ka) based on two Ocean Drilling Program (ODP) sites, Feni drift site 980 (55°29'N, 14°42'W) and Bjorn drift site 984 (61°25'N, 24°04'W). Benthic foraminiferal stable isotope data, planktic foraminiferal faunas, ice-rafted debris data, and faunally based sea-surface temperature estimates help test the hypothesis that oceanographic changes in the North Atlantic region were associated with the onset of the 100-kyr world during the mid-Pleistocene revolution. Based on percentage of Neogloboquadrina pachyderma (s) records from both sites, surface waters during interglacials and glacials were cooler in the mid-Pleistocene than during marine isotope stages (MIS) 5 and 6. In particular, interglaciations at Bjorn drift site 984 were significantly cooler. Faunal evidence suggests that the interglacial Arctic front shifted from a position between the two sites to a position northwest of Bjorn drift site 984 after ca. 610 ka. As during the late Pleistocene, we find faunal evidence for lagging surface warmth at most of the glacial initiations during the mid-Pleistocene. Each initiation is associated with high benthic δ13C values that are maintained into the succeeding glaciation, which we term "lagging NADW production." These findings indicate that lagging warmth and lagging NADW production are robust features of the regional climate system that persist in the middle to late Pleistocene.

Relevance: 30.00%

Abstract:

Major ice sheets were permanently established on Antarctica approximately 34 million years ago, close to the Eocene/Oligocene boundary, at the same time as a permanent deepening of the calcite compensation depth in the world's oceans. Until recently, it was thought that Northern Hemisphere glaciation began much later, between 11 and 5 million years ago. This view has been challenged, however, by records of ice rafting at high northern latitudes during the Eocene epoch and by estimates of global ice volume that exceed the storage capacity of Antarctica at the same time as a temporary deepening of the calcite compensation depth 41.6 million years ago. Here we test the hypothesis that large ice sheets were present in both hemispheres 41.6 million years ago using marine sediment records of oxygen and carbon isotope values and of calcium carbonate content from the equatorial Atlantic Ocean. These records allow, at most, an ice budget that can easily be accommodated on Antarctica, indicating that large ice sheets were not present in the Northern Hemisphere. The records also reveal a brief interval shortly before the temporary deepening of the calcite compensation depth during which the calcite compensation depth shoaled, ocean temperatures increased and carbon isotope values decreased in the equatorial Atlantic. The nature of these changes around 41.6 million years ago implies common links, in terms of carbon cycling, with events at the Eocene/Oligocene boundary and with the 'hyperthermals' of the Early Eocene climate optimum. Our findings help to resolve the apparent discrepancy between the geological records of Northern Hemisphere glaciation and model results that indicate that the threshold for continental glaciation was crossed earlier in the Southern Hemisphere than in the Northern Hemisphere.

Relevance: 30.00%

Abstract:

Efforts to evaluate the response of coral larvae to global climate change (GCC) and ocean acidification (OA) typically employ short experiments of fixed length, yet it is unknown how the response is affected by exposure duration. In this study, we exposed larvae from the brooding coral Pocillopora damicornis to contrasts of temperature (24.00 °C [ambient] versus 30.49 °C) and pCO2 (49.4 Pa versus 86.2 Pa) for varying periods (1-5 days) to test the hypothesis that exposure duration had no effect on larval response as assessed by protein content, respiration, Symbiodinium density, and survivorship; exposure times were ecologically relevant compared to representative pelagic larval durations (PLD) for corals. Larvae differed among days for all response variables, and the effects of the treatment were relatively consistent regardless of exposure duration for three of the four response variables. Protein content and Symbiodinium density were unaffected by temperature and pCO2, but respiration increased with temperature (but not pCO2), with the effect intensifying as incubations lengthened. Survival, however, differed significantly among treatments at the end of the study, and by the 5th day, 78% of the larvae were alive and swimming under ambient temperature and ambient pCO2, but only 55-59% were alive in the other treatments. These results demonstrate that the physiological effects of temperature and pCO2 on coral larvae can reliably be detected within days, but effects on survival require ≥5 days to detect. The detection of time-dependent effects on larval survivorship suggests that the influence of GCC and OA will be stronger for corals having long PLDs.
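
The day-5 survival contrast (78% versus 55-59% alive across the four temperature × pCO2 treatments) is the kind of outcome a contingency-table chi-square test can assess. A sketch with counts loosely back-calculated from those percentages, assuming invented group sizes of 100 larvae:

```python
from scipy.stats import chi2_contingency

# Rows: treatments; columns: alive-and-swimming vs dead at day 5
# (invented n = 100 larvae per group, not the study's counts)
counts = [
    [78, 22],   # ambient temperature, ambient pCO2
    [57, 43],   # high temperature, ambient pCO2
    [59, 41],   # ambient temperature, high pCO2
    [55, 45],   # high temperature, high pCO2
]
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.4f}")
```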