971 results for it use


Relevance: 30.00%

Abstract:

Contributed to: Fusion of Cultures. XXXVIII Annual Conference on Computer Applications and Quantitative Methods in Archaeology – CAA2010 (Granada, Spain, Apr 6-9, 2010)

Relevance: 30.00%

Abstract:

Background: Screen-viewing has been associated with increased body mass, increased risk of metabolic syndrome and lower psychological well-being among children and adolescents. There is a shortage of information about the nature of contemporary screen-viewing amongst children, especially given the rapid advances in screen-viewing equipment technology and its widespread availability. Anecdotal evidence suggests that large numbers of children embrace the multi-functionality of current devices to engage in multiple forms of screen-viewing at the same time. In this paper we used qualitative methods to assess the nature and extent of multiple forms of screen-viewing in UK children.

Methods: Focus groups were conducted with 10-11 year old children (n = 63) recruited from five primary schools in Bristol, UK. Topics included: the types of screen-viewing in which the participants engaged; whether the participants ever engaged in more than one form of screen-viewing at a time and, if so, the nature of this multiple viewing; reasons for engaging in multi-screen viewing; and the room within the house where multi-screen viewing took place and the reasons for selecting that room. All focus groups were transcribed verbatim, anonymised and thematically analysed.

Results: Multi-screen viewing was a common behaviour. Although it often involved watching TV, TV viewing was often the background behaviour, with attention focused on a laptop, handheld device or smartphone. There were three main reasons for engaging in multi-screen viewing: 1) tempering the impatience associated with a programme loading; 2) filtering out unwanted content such as advertisements; and 3) enjoyment. Multi-screen viewing occurred either in the child's bedroom or in the main living area of the home. There was considerable variability in the level and timing of viewing, and this appeared to be a function of whether the participants attended after-school clubs.

Conclusions: UK children regularly engage in two or more forms of screen-viewing at the same time. There are currently no means of assessing multi-screen viewing, nor any interventions that specifically focus on reducing it. To reduce children's overall screen-viewing we need to understand, and then develop approaches to reduce, multi-screen viewing among children.

Relevance: 30.00%

Abstract:

Aboriginal peoples in Canada have been mapping aspects of their cultures for more than a generation. Indians, Inuit, Métis, non-status Indians and others have called their maps by different names at various times and places: land use and occupancy; land occupancy and use; traditional use; traditional land use and occupancy; current use; culturally sensitive areas; and so on. I use "land use and occupancy mapping" in a generic sense to include all the above. The term refers to the collection of interview data about traditional use of resources and occupancy of lands by First Nation persons, and the presentation of those data in map form. Think of it as the geography of oral tradition, or as the mapping of cultural and resource geography. (PDF contains 81 pages.)

Relevance: 30.00%

Abstract:

The main objectives of this report, which is based on the current literature and key informant interviews, are to assess and analyse the nature and distribution of poverty and aquatic resources use, focusing especially on the livelihoods of the poor. It describes and reports the different ways of measuring poverty that are used in Cambodia and quantifies the diverse nature and geographic distribution of aquatic resources use in Cambodia. (PDF contains 55 pages.)

Relevance: 30.00%

Abstract:

On 23-24 September 2009 an international discussion workshop on "Main Drivers for Successful Re-Use of Research Data" was held in Berlin, prepared and organised by the Knowledge Exchange working group on Primary Research Data. The main focus of the workshop was on the benefits, challenges and obstacles of re-using data from a researcher's perspective. The use cases presented by researchers from a variety of disciplines were supplemented by two keynotes and selected presentations by specialists from infrastructure institutions, publishers, and funding bodies at the national and European levels.

Researchers' perspectives: The workshop provided a critical evaluation of the lessons learned on sharing and re-using research data from a researcher's perspective, and of the actions that might still be taken to improve successful re-use. Despite the individual differences characterising the diverse disciplines, it became clear that the important issues are comparable.

Combine forces to support re-use and sharing of data: Apart from several technical challenges such as metadata exchange standards and quality assurance, it was obvious that the most important obstacles to re-using research data more efficiently are socially determined. It was agreed that, in order to overcome this problem, more effort should be made to raise awareness and combine forces to support the re-use and sharing of research data at all levels (researchers, institutions, publishers, funders, governments).

Relevance: 30.00%

Abstract:

We study the language choice behavior of bilingual speakers in modern societies, such as the Basque Country, Ireland and Wales. These countries have two official languages: A, spoken by all, and B, spoken by a minority. We think of the bilinguals in those societies as a population repeatedly playing a Bayesian game in which they must strategically choose the language, A or B, that might be used in the interaction. The choice has to be made under imperfect information about the linguistic type of the interlocutors. We take the Nash equilibrium of the language use game as a model for real-life language choice behavior. It is shown that the predictions made with this model fit very well the data about actual use, contained in the censuses, of the Basque, Irish and Welsh languages. The question posed by Fishman (2001), which appears in the title, is then answered as follows: it is hard, mainly, because bilingual speakers have reached an equilibrium which is evolutionarily stable. This means that to solve, fast and in a reflex manner, their frequent language coordination problem, bilinguals have developed linguistic conventions based chiefly on the strategy 'Use the same language as your interlocutor', which weakens the actual use of B.
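A minimal Monte Carlo sketch of how a convention like 'Use the same language as your interlocutor' can depress the use of B: conversations happen in B only when both parties are bilingual and one of them opens in B. The function name and parameters here are illustrative assumptions, not the paper's actual game-theoretic model.

```python
import random

def simulate_b_use(bilingual_frac, init_in_b=0.5, n_pairs=100_000, seed=0):
    """Estimate the share of random pairwise conversations held in B.

    Assumes a conversation takes place in B only when BOTH speakers are
    bilingual AND the opener chooses B (probability init_in_b); otherwise
    the 'same language as your interlocutor' convention locks the pair
    into A, the language everyone speaks.
    """
    rng = random.Random(seed)
    in_b = 0
    for _ in range(n_pairs):
        s1_bilingual = rng.random() < bilingual_frac
        s2_bilingual = rng.random() < bilingual_frac
        if s1_bilingual and s2_bilingual and rng.random() < init_in_b:
            in_b += 1
    return in_b / n_pairs
```

Even with half the population bilingual, the estimated share of conversations in B falls well below one half, illustrating why actual use of B lags behind the number of its speakers.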

Relevance: 30.00%

Abstract:

Sisal hemp (Agave sisalana) leaves were harvested and processed using the beating and decomposition methods. The fibres obtained were washed, dried and finally spun into cordage of about 4 mm diameter. Altogether 39 pieces of rope, each measuring 2 metres, were spun. 30 of these ropes were immersed in water for a period of 24 weeks, 6 were placed in a shaded and airy place, and 3 were used for the head and foot lines of a gillnet, the sinker line of a cast net and the main line of a longline. Every other week, the ropes in water and air were tested for breaking strength using an improved 50 kg spring balance. At the end of the experiment, it was found that the immersed ropes maintained a tensile strength of over 50 kgf for the first 18 weeks; thereafter, there was a gradual weekly reduction in strength until the 23rd week, when the tensile strength was less than 1 kgf. The cost-benefit analysis showed that about 5.3146 tons of processed fibres could be obtained from 1 ha, capable of being spun into 528,300 m of 4 mm diameter cordage. The paper finally recommends that fisherfolk grow sisal hemp plants so that there will be a constant stock for intermittent harvesting for rope spinning.

Relevance: 30.00%

Abstract:

[EN] This paper is an outcome of the following dissertation:

Relevance: 30.00%

Abstract:

The fishery of Lake Victoria became a major commercial fishery with the introduction of Nile perch in the 1950s and 1960s. Biological and population characteristics point to a fishery under intense fishing pressure, attributed to increased capacity and the use of illegal fishing gears. Studies conducted between 1998 and 2000 suggested capturing fish within a slot size of 50 to 85 cm TL to sustain the fishery. Samples from Kenyan and Ugandan factories in 2008 showed that 50% and 71% of individuals processed, respectively, were below the slot size. This study revealed that fish below and above the slot have continued to be caught and processed, confirming that the slot size is hardly adhered to by either the fishers or the processors. The paper explores why the slot size has not been a successful tool in the management of Nile perch and suggests strategies to sustain the fishery.

Relevance: 30.00%

Abstract:

The initial objective of Part I was to determine the nature of upper mantle discontinuities, the average velocities through the mantle, and differences between mantle structure under continents and oceans by the use of P'dP', the seismic core phase P'P' (PKPPKP) that reflects at depth d in the mantle. In order to accomplish this, it was found necessary to also investigate core phases themselves and their inferences on core structure. P'dP' at both single stations and at the LASA array in Montana indicates that the following zones are candidates for discontinuities with varying degrees of confidence: 800-950 km, weak; 630-670 km, strongest; 500-600 km, strong but interpretation in doubt; 350-415 km, fair; 280-300 km, strong, varying in depth; 100-200 km, strong, varying in depth, may be the bottom of the low-velocity zone. It is estimated that a single station cannot easily discriminate between asymmetric P'P' and P'dP' for lead times of about 30 sec from the main P'P' phase, but the LASA array reduces this uncertainty range to less than 10 sec. The problems of scatter of P'P' main-phase times, mainly due to asymmetric P'P', incorrect identification of the branch, and lack of the proper velocity structure at the velocity point, are avoided and the analysis shows that one-way travel of P waves through oceanic mantle is delayed by 0.65 to 0.95 sec relative to United States mid-continental mantle.

A new P-wave velocity core model is constructed from observed times, dt/dΔ's, and relative amplitudes of P'; the observed times of SKS, SKKS, and PKiKP; and a new mantle-velocity determination by Jordan and Anderson. The new core model is smooth except for a discontinuity at the inner-core boundary determined to be at a radius of 1215 km. Short-period amplitude data do not require the inner core Q to be significantly lower than that of the outer core. Several lines of evidence show that most, if not all, of the arrivals preceding the DF branch of P' at distances shorter than 143° are due to scattering as proposed by Haddon and not due to spherically symmetric discontinuities just above the inner core as previously believed. Calculation of the travel-time distribution of scattered phases and comparison with published data show that the strongest scattering takes place at or near the core-mantle boundary close to the seismic station.

In Part II, the largest events in the San Fernando earthquake series, initiated by the main shock at 14 00 41.8 GMT on February 9, 1971, were chosen for analysis from the first three months of activity, 87 events in all. The initial rupture location coincides with the lower, northernmost edge of the main north-dipping thrust fault and the aftershock distribution. The best focal mechanism fit to the main shock P-wave first motions constrains the fault plane parameters to: strike, N 67° (± 6°) W; dip, 52° (± 3°) NE; rake, 72° (67°-95°) left lateral. Focal mechanisms of the aftershocks clearly outline a downstep of the western edge of the main thrust fault surface along a northeast-trending flexure. Faulting on this downstep is left-lateral strike-slip and dominates the strain release of the aftershock series, which indicates that the downstep limited the main event rupture on the west. The main thrust fault surface dips at about 35° to the northeast at shallow depths and probably steepens to 50° below a depth of 8 km. This steep dip at depth is a characteristic of other thrust faults in the Transverse Ranges and indicates the presence at depth of laterally-varying vertical forces that are probably due to buckling or overriding that causes some upward redirection of a dominant north-south horizontal compression. Two sets of events exhibit normal dip-slip motion with shallow hypocenters and correlate with areas of ground subsidence deduced from gravity data. Several lines of evidence indicate that a horizontal compressional stress in a north or north-northwest direction was added to the stresses in the aftershock area 12 days after the main shock. After this change, events were contained in bursts along the downstep and sequencing within the bursts provides evidence for an earthquake-triggering phenomenon that propagates with speeds of 5 to 15 km/day. 
Seismicity before the San Fernando series and the mapped structure of the area suggest that the downstep of the main fault surface is not a localized discontinuity but is part of a zone of weakness extending from Point Dume, near Malibu, to Palmdale on the San Andreas fault. This zone is interpreted as a decoupling boundary between crustal blocks that permits them to deform separately in the prevalent crustal-shortening mode of the Transverse Ranges region.

Relevance: 30.00%

Abstract:

The ritterazine and cephalostatin natural products have biological activities and structures that are interesting to synthetic organic chemists. These products have been found to exhibit significant cytotoxicity against P388 murine leukemia cells, and therefore have the potential to be used as anticancer drugs. The ritterazines and cephalostatins are steroidal dimers joined by a central pyrazine ring. Given that the steroid halves are unsymmetrical and highly oxygenated, there are several challenges in synthesizing these compounds in an organic laboratory.

Ritterazine B is the most potent derivative in the ritterazine family. Its biological activity is comparable to drugs that are being used to treat cancer today. For this reason, and the fact that there are no reported syntheses of ritterazine B to date, our lab set out to synthesize this natural product.

Herein, efforts toward the synthesis of the western fragment of ritterazine B are described. Two different routes are explored to access a common intermediate. An alkyne conjugate addition reaction was initially investigated due to the success of this key reaction in the synthesis of the eastern fragment. However, it has been found that a propargylation reaction has greater reactivity and yields, and has the potential to reduce the step count of the synthesis of the western fragment of ritterazine B.

Relevance: 30.00%

Abstract:

Energy and sustainability have become one of the most critical issues of our generation. While the abundant potential of renewable energy such as solar and wind provides a real opportunity for sustainability, their intermittency and uncertainty present a daunting operating challenge. This thesis aims to develop analytical models, deployable algorithms, and real systems to enable efficient integration of renewable energy into complex distributed systems with limited information.

The first thrust of the thesis is to make IT systems more sustainable by facilitating the integration of renewable energy into these systems. IT is one of the fastest growing sectors in energy usage and greenhouse gas pollution. Over the last decade there have been dramatic improvements in the energy efficiency of IT systems, but these efficiency improvements do not necessarily lead to a reduction in energy consumption because more servers are demanded. Further, little effort has been put into making IT more sustainable, and most of the improvements come from improved "engineering" rather than improved "algorithms". In contrast, my work focuses on developing algorithms, with rigorous theoretical analysis, that improve the sustainability of IT. In particular, this thesis seeks to exploit the flexibilities of cloud workloads both (i) in time, by scheduling delay-tolerant workloads, and (ii) in space, by routing requests to geographically diverse data centers. These opportunities allow data centers to adaptively respond to renewable availability, varying cooling efficiency, and fluctuating energy prices, while still meeting performance requirements. The design of the enabling algorithms is, however, very challenging because of limited information, non-smooth objective functions and the need for distributed control. Novel distributed algorithms are developed with theoretically provable guarantees to enable "follow the renewables" routing. Moving from theory to practice, I helped HP design and implement the industry's first Net-zero Energy Data Center.
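A minimal sketch of the "follow the renewables" idea, assuming a simple greedy allocation: serve load from on-site renewable headroom first (greenest center first), then spill the remainder onto the cheapest grid power. The function name, inputs and policy are illustrative, not the thesis's actual algorithm.

```python
def follow_the_renewables(load, renewables, capacities, grid_price):
    """Allocate `load` across data centers, preferring renewable supply.

    renewables[i]  -- on-site renewable headroom at center i (treated as free)
    capacities[i]  -- total serving capacity of center i
    grid_price[i]  -- marginal price of grid power at center i
    Returns the per-center allocation as a list.
    """
    n = len(capacities)
    alloc = [0.0] * n
    remaining = load

    # Phase 1: fill renewable headroom, greenest center first.
    for i in sorted(range(n), key=lambda i: -renewables[i]):
        take = min(remaining, renewables[i], capacities[i])
        alloc[i] += take
        remaining -= take
        if remaining <= 0:
            return alloc

    # Phase 2: spill the rest onto the cheapest grid power, capacity permitting.
    for i in sorted(range(n), key=lambda i: grid_price[i]):
        take = min(remaining, capacities[i] - alloc[i])
        alloc[i] += take
        remaining -= take
        if remaining <= 0:
            break
    return alloc
```

A real deployment would of course face the complications the thesis names: the allocation must be computed online under uncertainty, distributedly, and against non-smooth cost functions, rather than with full knowledge as in this sketch.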

The second thrust of this thesis is to use IT systems to improve the sustainability and efficiency of our energy infrastructure through data center demand response. The main challenges as we integrate more renewable sources into the existing power grid come from the fluctuation and unpredictability of renewable generation. Although energy storage and reserves can potentially solve these issues, they are very costly. One promising alternative is to make cloud data centers demand responsive. The potential of such an approach is huge.

To realize this potential, we need adaptive and distributed control of cloud data centers and new electricity market designs for distributed electricity resources. My work progresses in both directions. In particular, I have designed online algorithms with theoretically guaranteed performance for data center operators to deal with uncertainties under popular demand response programs. Based on local control rules of customers, I have further designed new pricing schemes for demand response that align the interests of customers, utility companies, and society to improve social welfare.

Relevance: 30.00%

Abstract:

A real-time, in situ fixing method using heating with a CO2 laser beam is suggested for thermal fixing of a small local hologram in the bulk of an Fe:LiNbO3 photorefractive crystal. A heat-guiding technique is developed for heating a volume with a shape similar to that of the laser beam up to 100 °C-200 °C. On the basis of the heat-transfer equations, different heating modes, with or without metal absorbers for heat guiding, obtained by use of a continuous or a pulsed laser beam, are analyzed. The optimal mode may be pulsed heating with absorbers. On this basis experiments have been designed and demonstrated. It is seen that the fixing process with the CO2 laser beam is short compared with the process using an oven, and the fixing efficiency is quite high. (C) 1998 Optical Society of America.
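The kind of heat-transfer analysis referred to above can be illustrated with a minimal explicit finite-difference sketch of 1D conduction from a laser-heated surface into the crystal. All parameter values and names here are illustrative assumptions, not the paper's.

```python
def heat_1d(n=50, dx=1e-4, alpha=1e-6, dt=None, steps=2000, source_temp=200.0):
    """Explicit finite-difference solution of the 1D heat equation.

    The first grid cell models the face heated by the CO2 laser spot,
    held at source_temp (deg C); the far boundary stays at room
    temperature. alpha is an assumed thermal diffusivity (m^2/s).
    """
    if dt is None:
        dt = 0.4 * dx * dx / alpha  # respect stability limit dt <= dx^2 / (2*alpha)
    r = alpha * dt / (dx * dx)
    T = [20.0] * n                  # crystal initially at room temperature
    for _ in range(steps):
        T[0] = source_temp          # heated face
        new = T[:]
        for i in range(1, n - 1):   # interior diffusion update
            new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
        T = new
    return T
```

The resulting profile decays monotonically away from the heated face, which illustrates why heat guiding (for example, with metal absorbers) is needed to confine the 100 °C-200 °C region to a beam-shaped volume rather than a broad diffusive halo.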