931 results for IT outsourcing
Abstract:
Last year, Jisc began work with EDUCAUSE - the US organisation for IT professionals in higher education - to identify the skillset of the CIO of the future. One of the project's findings was that many aspiring technology leaders find it difficult to make the step up. Louisa Dale, director of Jisc group sector intelligence, talks through the lessons learned and opens a call for IT professionals to get involved in the next phase of work.
Abstract:
Case study on how 16- to 18-year-old students at Portsmouth College have access to an iPad mini to support independent and personalised learning.
Abstract:
Case study on how South Eastern Regional College are taking a strategic approach to managing and developing digital technologies to enhance the student experience.
Abstract:
This paper studies the feasibility of calculating strains in aged F114 steel specimens with Fiber Bragg Grating (FBG) sensors and infrared thermography (IT) techniques. Two specimens were conditioned under extreme temperature and relative humidity conditions, with comparative stress tests carried out before and after aging using different adhesives. Moreover, IT techniques have been compared with conventional methods for calculating stresses in F114 steel. Implementing Structural Health Monitoring techniques on real aircraft during their life cycle requires a study of the behaviour of FBG sensors and their wiring under real conditions before they are used over long periods. To simulate aging, specimens were stored in a climate chamber at 70 °C and 90% RH for 60 days. This study is framed within the Structural Health Monitoring (SHM) and Non-Destructive Evaluation (NDE) research lines, integrated into the avionics area maintained by the Aeronautical Technologies Centre (CTA) and the University of the Basque Country (UPV/EHU).
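For context on how strain is typically recovered from an FBG reading, here is a minimal sketch (not the paper's own code) based on the standard Bragg wavelength sensitivity relation, using nominal textbook constants for silica fiber rather than values from this study:

```python
# Illustrative only: recover mechanical strain from an FBG wavelength shift
# using the standard relation d(lambda)/lambda0 = (1 - p_e)*strain + (alpha + xi)*dT.
# Constants are typical values for silica fiber, not those of the study above.
P_E = 0.22        # effective photo-elastic coefficient (typical for silica)
ALPHA = 0.55e-6   # thermal expansion coefficient of silica [1/degC]
XI = 8.6e-6       # thermo-optic coefficient [1/degC]

def fbg_strain(lambda_0_nm, lambda_nm, delta_temp_c=0.0):
    """Return mechanical strain (dimensionless) from a Bragg wavelength shift,
    after removing the temperature contribution."""
    rel_shift = (lambda_nm - lambda_0_nm) / lambda_0_nm
    thermal = (ALPHA + XI) * delta_temp_c
    return (rel_shift - thermal) / (1.0 - P_E)

# Example: a 1550 nm grating shifted to 1550.6 nm at constant temperature
print(f"strain = {fbg_strain(1550.0, 1550.6):.6e}")  # ~4.96e-4, i.e. about 496 microstrain
```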
Abstract:
We study the language choice behavior of bilingual speakers in modern societies, such as the Basque Country, Ireland and Wales. These countries have two official languages: A, spoken by all, and B, spoken by a minority. We think of the bilinguals in those societies as a population repeatedly playing a Bayesian game in which they must strategically choose the language, A or B, to be used in the interaction. The choice has to be made under imperfect information about the linguistic type of the interlocutors. We take the Nash equilibrium of the language use game as a model for real-life language choice behavior. It is shown that the predictions made with this model fit very well the data on actual use, contained in the censuses, of the Basque, Irish and Welsh languages. The question posed by Fishman (2001), which appears in the title, is then answered as follows: it is hard mainly because bilingual speakers have reached an equilibrium which is evolutionarily stable. This means that, to solve their frequent language coordination problem fast and in a reflex manner, bilinguals have developed linguistic conventions based chiefly on the strategy 'Use the same language as your interlocutor', which weakens the actual use of B.
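To make the coordination logic concrete, here is a toy sketch of the expected-payoff comparison a bilingual faces; the payoff values and bilingual share are hypothetical illustrations, not the paper's estimated parameters:

```python
# Illustrative sketch of the language coordination game (all numbers hypothetical).
def expected_payoff(strategy, p_b, q, reward=1.0, premium=0.2):
    """Expected payoff for a bilingual speaker.

    strategy : 'A' or 'B' -- the language the speaker opens with
    p_b      : probability the interlocutor is bilingual
    q        : probability a bilingual interlocutor opens with B
    reward   : payoff for coordinating on a common language
    premium  : extra payoff bilinguals attach to using the minority language B
    """
    if strategy == 'A':
        return reward  # everyone speaks A, so coordination is guaranteed
    # Opening with B coordinates only with bilinguals who also use B.
    return p_b * q * (reward + premium)

for q in (0.2, 0.5, 0.9):
    pa = expected_payoff('A', p_b=0.4, q=q)
    pb = expected_payoff('B', p_b=0.4, q=q)
    print(f"q={q:.1f}: payoff(A)={pa:.2f}, payoff(B)={pb:.2f}")
# With bilinguals in the minority, opening with A dominates even when most
# bilinguals favour B -- the 'match your interlocutor' convention locks in A
# and weakens the use of B, as the abstract describes.
```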
Abstract:
The fishery of Lake Victoria became a major commercial fishery with the introduction of Nile perch in the 1950s and 1960s. Biological and population characteristics point to a fishery under intense fishing pressure, attributed to increased capacity and the use of illegal fishing gears. Studies conducted between 1998 and 2000 suggested capturing fish within a slot size of 50 to 85 cm TL to sustain the fishery. Samples from Kenyan and Ugandan factories in 2008 showed that 50% and 71% of the individuals processed, respectively, were below the slot size. This study revealed that fish below and above the slot have continued to be caught and processed, confirming that the slot size is hardly adhered to by either the fishers or the processors. The paper explores why the slot size has not been a successful tool in the management of Nile perch and suggests strategies to sustain the fishery.
Abstract:
The search for reliable proxies of past deep ocean temperature and salinity has proved difficult, thereby limiting our ability to understand the coupling of ocean circulation and climate over glacial-interglacial timescales. Previous inferences of deep ocean temperature and salinity from sediment pore fluid oxygen isotopes and chlorinity indicate that the deep ocean density structure at the Last Glacial Maximum (LGM, approximately 20,000 years BP) was set by salinity, and that the density contrast between northern and southern sourced deep waters was markedly greater than in the modern ocean. High density stratification could help explain the marked contrast in carbon isotope distribution recorded in the LGM ocean relative to what we observe today, but what made the ocean's density structure so different at the LGM? How did it evolve from one state to another? Further, given the sparsity of the LGM temperature and salinity data set, what else can we learn by increasing the spatial density of proxy records?
We investigate the cause and feasibility of a highly salinity-stratified deep ocean at the LGM, and we work to increase the amount of information we can glean about the past ocean from pore fluid profiles of oxygen isotopes and chloride. Using a coupled ocean–sea ice–ice shelf cavity model, we test whether the deep ocean density structure at the LGM can be explained by ice–ocean interactions over the Antarctic continental shelves, and show that a large part of the LGM salinity stratification can be explained by lower ocean temperatures. In order to extract the maximum information from pore fluid profiles of oxygen isotopes and chloride, we evaluate several inverse methods for ill-posed problems and their ability to recover bottom water histories from sediment pore fluid profiles. We demonstrate that Bayesian Markov Chain Monte Carlo parameter estimation techniques enable us to robustly recover the full solution space of bottom water histories, not only at the LGM but through the most recent deglaciation and the Holocene up to the present. Finally, we evaluate a non-destructive pore fluid sampling technique, Rhizon samplers, against traditional squeezing methods, and show that despite their promise, Rhizons are unlikely to be a good sampling tool for pore fluid measurements of oxygen isotopes and chloride.
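As a flavour of the Bayesian MCMC parameter estimation mentioned above, here is a minimal Metropolis-Hastings sketch that recovers a single hypothetical bottom-water parameter from synthetic noisy data; the thesis's pore-fluid diffusion model and full solution-space recovery are far richer than this stand-in:

```python
# Minimal Metropolis-Hastings sketch; the data and model are synthetic
# stand-ins, not the thesis's pore-fluid diffusion model.
import math
import random

true_mu, sigma = 1.0, 0.3                      # hypothetical signal and noise level
data = [random.gauss(true_mu, sigma) for _ in range(50)]

def log_likelihood(mu):
    return sum(-0.5 * ((d - mu) / sigma) ** 2 for d in data)

def metropolis(n_steps=5000, step=0.1):
    mu, ll = 0.0, log_likelihood(0.0)
    chain = []
    for _ in range(n_steps):
        prop = mu + random.gauss(0.0, step)    # symmetric random-walk proposal
        ll_prop = log_likelihood(prop)
        if math.log(random.random()) < ll_prop - ll:  # accept/reject step
            mu, ll = prop, ll_prop
        chain.append(mu)
    return chain

chain = metropolis()
burned = chain[1000:]                          # discard burn-in
print(f"posterior mean ~ {sum(burned) / len(burned):.3f}")
```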
Abstract:
Energy and sustainability have become among the most critical issues of our generation. While the abundant potential of renewable energy sources such as solar and wind provides a real opportunity for sustainability, their intermittency and uncertainty present a daunting operating challenge. This thesis aims to develop analytical models, deployable algorithms, and real systems to enable efficient integration of renewable energy into complex distributed systems with limited information.
The first thrust of the thesis is to make IT systems more sustainable by facilitating the integration of renewable energy into these systems. IT is one of the fastest-growing sectors in energy usage and greenhouse gas pollution. Over the last decade there have been dramatic improvements in the energy efficiency of IT systems, but these efficiency gains have not necessarily reduced energy consumption because more servers are demanded. Further, little effort has been put into making IT more sustainable, and most of the improvements have come from better "engineering" rather than better "algorithms". In contrast, my work focuses on developing algorithms, with rigorous theoretical analysis, that improve the sustainability of IT. In particular, this thesis seeks to exploit the flexibilities of cloud workloads both (i) in time, by scheduling delay-tolerant workloads, and (ii) in space, by routing requests to geographically diverse data centers. These opportunities allow data centers to adaptively respond to renewable availability, varying cooling efficiency, and fluctuating energy prices, while still meeting performance requirements. The design of the enabling algorithms is, however, very challenging because of limited information, non-smooth objective functions, and the need for distributed control. Novel distributed algorithms are developed with theoretically provable guarantees to enable "follow the renewables" routing. Moving from theory to practice, I helped HP design and implement the industry's first Net-zero Energy Data Center.
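A minimal sketch of the "follow the renewables" idea (my own illustration; the thesis's actual algorithms are distributed and carry provable guarantees this toy version lacks): greedily send load to the data centers with the most renewable supply, subject to capacity. All numbers are hypothetical.

```python
# Toy 'follow the renewables' routing: fill the greenest sites first.
def route(load, capacity, renewable):
    """Split `load` across data centers, preferring sites with more renewable supply."""
    alloc = [0.0] * len(capacity)
    for i in sorted(range(len(capacity)), key=lambda i: renewable[i], reverse=True):
        take = min(load, capacity[i])          # respect site capacity
        alloc[i], load = take, load - take
        if load <= 0:
            break
    return alloc

print(route(load=120.0, capacity=[50, 60, 80], renewable=[10, 45, 30]))
# -> [0.0, 60.0, 60.0]: the two greenest sites absorb all of the load,
# and the site with the least renewable supply stays idle.
```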
The second thrust of this thesis is to use IT systems to improve the sustainability and efficiency of our energy infrastructure through data center demand response. The main challenges as we integrate more renewable sources into the existing power grid come from the fluctuation and unpredictability of renewable generation. Although energy storage and reserves can potentially solve these issues, they are very costly. One promising alternative is to make cloud data centers demand responsive. The potential of such an approach is huge.
To realize this potential, we need adaptive and distributed control of cloud data centers and new electricity market designs for distributed electricity resources. My work progresses in both directions. In particular, I have designed online algorithms with theoretically guaranteed performance for data center operators to deal with uncertainties under popular demand response programs. Based on local control rules of customers, I have further designed new pricing schemes for demand response that align the interests of customers, utility companies, and society to improve social welfare.
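As a simple illustration of price-responsive data center control (my own toy example, not the thesis's online algorithms, which carry formal performance guarantees): defer flexible load whenever the real-time price exceeds a threshold. All quantities are hypothetical.

```python
# Toy price-responsive dispatch: defer flexible load when power is expensive.
def dispatch(base_load, flexible_load, price, threshold=50.0):
    """Return (power drawn now, load deferred to a cheaper period)."""
    if price > threshold:
        return base_load, flexible_load          # expensive: defer what we can
    return base_load + flexible_load, 0.0        # cheap: run everything now

prices = [32, 48, 95, 61, 27]                    # hypothetical $/MWh over five periods
for t, p in enumerate(prices):
    drawn, deferred = dispatch(base_load=10.0, flexible_load=4.0, price=p)
    print(f"t={t}: price={p}, draw={drawn} MW, deferred={deferred} MW")
```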
Abstract:
The tertiarization of the economy, clearly present since the early days of the city of Rio de Janeiro, has driven deep transformations in the internal organization of the city, implying new uses of urban space and generating new processes in the city, above all in our study area, the neighborhood of Botafogo. Globalization brings significant changes to the world economy. The city is thus inserted into the spaces of globalization and undergoes changes that are reflected in its urban structure. In this way the service sector is energized, reinforcing the tertiarization process, and the relationship between urbanization and tertiary activities becomes evident. One can thus perceive the emergence of a new model of urbanization, tied to the presence of these tertiary activities, which are of central importance for the constitution of new centralities within the city's internal space, as is the case of Botafogo. In this context, spaces of culture and leisure are of fundamental importance, since they attract flows of consumers who take advantage of the infrastructure provided by the State, above all transport, through the metro and the numerous bus lines that serve the neighborhood. These spaces of culture and leisure are of great importance in the context of Botafogo's renewal, since the neighborhood presents a high concentration of facilities of this kind, which contribute to a greater circulation of people within it. Two areas of concentration of these spaces can be observed in Botafogo, located at the ends of, and adjacent to, the neighborhood's main road axis, where specialized services are also concentrated. In the sphere of consumption, then, these spaces of culture and leisure emerge as the result of public and private actions. They give Botafogo a new dynamic and have an impact on the urban economy of Rio de Janeiro. They are responsible for the emergence of a new centrality within the city and thus contribute to the process of urban restructuring of the city of Rio de Janeiro.