321 results for developer
Abstract:
In the tutorial we explored the PayPal API as an example of an API that implements HATEOAS.
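A minimal sketch of the HATEOAS idea the tutorial illustrates: each response embeds the links a client may follow next, so the client discovers available actions at runtime instead of hard-coding URLs. The resource shape and names below are hypothetical, loosely modeled on PayPal-style payment responses, not the actual PayPal API.

```python
# A hypothetical HATEOAS response: the "links" array advertises which
# actions are currently available on the resource via "rel" values.
payment = {
    "id": "PAY-123",
    "state": "created",
    "links": [
        {"rel": "self",
         "href": "https://api.example.com/payments/PAY-123",
         "method": "GET"},
        {"rel": "approve",
         "href": "https://api.example.com/payments/PAY-123/approve",
         "method": "POST"},
    ],
}

def link_for(resource, rel):
    """Return the hypermedia link with the given rel, or None."""
    return next((l for l in resource["links"] if l["rel"] == rel), None)

# The client follows the "approve" link rather than constructing the URL.
approve = link_for(payment, "approve")
```

Because the server controls the link set, it can add, remove, or relocate actions without breaking clients that navigate by `rel`.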
Abstract:
Virginia Apgar (1909-1974) is one of the most recognized American doctors, known worldwide for her contribution as the developer of the "Apgar test", a method used for the evaluation of newborns all over the world. She had many interests. She was an anesthesiologist and a brilliant teacher and researcher, but she also loved lecturing, basketball, fishing, golf, philately, and music. She played the violin and cello and performed on those instruments in various chamber groups. Motivated by one of her patients, Carleen Hutchinson, a science and music teacher, she made four instruments: a viola, a violin, a cello, and a mezzo violin. Nearly twenty years after her death, on October 24, 1994, on the occasion of the annual meeting of the American Academy of Pediatrics and the issue by the American Postal Service of a stamp honoring her, some of Dr. Apgar's favorite pieces of music were performed on the instruments she had made. Her life wove together many different activities and left invaluable contributions to humanity.
A variational approach for calculating Franck-Condon factors including mode-mode anharmonic coupling
Abstract:
We have implemented our new procedure for computing Franck-Condon factors utilizing vibrational configuration interaction based on a vibrational self-consistent field reference. Both Duschinsky rotations and anharmonic three-mode coupling are taken into account. Simulations of the first ionization band of ClO2 and C4H4O (furan) using up to quadruple excitations in treating anharmonicity are reported and analyzed. A developer version of the MIDASCPP code was employed to obtain the required anharmonic vibrational integrals and transition frequencies.
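As a reminder of the standard definitions behind this abstract: a Franck-Condon factor is the squared overlap of initial- and final-state vibrational wavefunctions, and the Duschinsky rotation relates the two states' normal coordinates through a rotation matrix and a displacement vector:

```latex
% Franck-Condon factor: squared overlap of vibrational wavefunctions
F_{v'v''} = \left| \langle \Psi_{v'} \mid \Psi_{v''} \rangle \right|^{2}

% Duschinsky relation between initial and final normal coordinates:
% J is the Duschinsky rotation matrix, K the displacement vector
\mathbf{Q}' = \mathbf{J}\,\mathbf{Q}'' + \mathbf{K}
```

The paper's contribution is evaluating these overlaps with anharmonic (VSCF/VCI) wavefunctions rather than the usual harmonic ones.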
Abstract:
New web-mapping platforms appear continually, with the drawback that each one uses its own API. Given the great heterogeneity of existing map APIs, it would be convenient to have a map library capable of abstracting the developer away from the small differences between them. This is the goal of the open-source JavaScript library Mapstraction. This kind of API is known as a "universal, polyglot API". Thanks to Mapstraction, applications can be developed in which the user can view cartographic information from several providers, but it has the drawback of providing no creation or editing mechanisms. This document presents the main novelties of the IDELab MapstractionInteractive library, an extension of Mapstraction that offers new functionality to remedy these shortcomings. The new functionality implemented for the providers included in the library gives the user the ability to edit and create geometries on the map (points, lines, and polygons). In addition, new map events are also implemented in the library, so that the programmer can have greater control over what the user does on the maps.
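The provider-abstraction idea behind such a "universal, polyglot" map API can be sketched as a simple facade over interchangeable backends. This is an illustrative Python sketch of the design pattern only; the class names (MapFacade, ProviderA) are hypothetical and do not reflect Mapstraction's actual JavaScript API.

```python
# Application code talks only to MapFacade; each provider hides its
# own quirks behind a common interface, so switching providers is a
# one-line change in the application.
class MapBackend:
    """Common interface every map provider must implement."""
    def add_marker(self, lat, lon):
        raise NotImplementedError

class ProviderA(MapBackend):
    """Stand-in for one concrete provider (e.g. a tile service)."""
    def __init__(self):
        self.markers = []
    def add_marker(self, lat, lon):
        self.markers.append((lat, lon))

class MapFacade:
    """Provider-independent entry point for the application."""
    def __init__(self, backend: MapBackend):
        self.backend = backend
    def add_marker(self, lat, lon):
        self.backend.add_marker(lat, lon)

backend = ProviderA()
m = MapFacade(backend)
m.add_marker(41.65, -4.72)  # a point in Valladolid
```

The editing extension described in the abstract amounts to widening this common interface (adding geometry creation and event hooks) and implementing the new methods once per provider.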
Abstract:
Would a research assistant - who can search for ideas related to those you are working on, network with others (but only share the things you have chosen to share), doesn’t need coffee and who might even, one day, appear to be conscious - help you get your work done? Would it help your students learn? There is a body of work showing that digital learning assistants can be a benefit to learners. It has been suggested that adaptive, caring, agents are more beneficial. Would a conscious agent be more caring, more adaptive, and better able to deal with changes in its learning partner’s life? Allow the system to try to dynamically model the user, so that it can make predictions about what is needed next, and how effective a particular intervention will be. Now, given that the system is essentially doing the same things as the user, why don’t we design the system so that it can try to model itself in the same way? This should mimic a primitive self-awareness. People develop their personalities, their identities, through interacting with others. It takes years for a human to develop a full sense of self. Nobody should expect a prototypical conscious computer system to be able to develop any faster than that. How can we provide a computer system with enough social contact to enable it to learn about itself and others? We can make it part of a network. Not just chatting with other computers about computer ‘stuff’, but involved in real human activity. Exposed to ‘raw meaning’ – the developing folksonomies coming out of the learning activities of humans, whether they are traditional students or lifelong learners (a term which should encompass everyone). Humans have complex psyches, comprised of multiple strands of identity which reflect as different roles in the communities of which they are part – so why not design our system the same way? 
With multiple internal modes of operation, each capable of being reflected onto the outside world in the form of roles – as a mentor, a research assistant, maybe even as a friend. But in order to be able to work with a human for long enough to be able to have a chance of developing the sort of rich behaviours we associate with people, the system needs to be able to function in a practical and helpful role. Unfortunately, it is unlikely to get a free ride from many people (other than its developer!) – so it needs to be able to perform a useful role, and do so securely, respecting the privacy of its partner. Can we create a system which learns to be more human whilst helping people learn?
Abstract:
A traditional method of validating the performance of a flood model when remotely sensed data of the flood extent are available is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LIDAR) has made more straightforward the synoptic measurement of water surface elevations along flood waterlines, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1 in 5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. A result of this was that there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. 
The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows for the decomposition of the reach into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may lead to an increased onus being placed on the model developer in the production of a valid model.
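The two performance measures being compared can be sketched as toy functions: the areal measure as the overlap of observed and modelled wet-pixel sets, and the height-based measure as the r.m.s. difference between observed and modelled waterline elevations at corresponding points. This is an illustrative simplification, not the paper's GLUE implementation.

```python
import math

def areal_fit(observed, modelled):
    """Areal pattern measure: |A intersect B| / |A union B| over sets
    of wet pixels (one common form of flood-extent fit statistic)."""
    union = len(observed | modelled)
    return len(observed & modelled) / union if union else 0.0

def rms_height_diff(obs_heights, mod_heights):
    """Height-based measure: r.m.s. difference (metres) between
    observed and modelled waterline elevations at matched points."""
    diffs = [(o - m) ** 2 for o, m in zip(obs_heights, mod_heights)]
    return math.sqrt(sum(diffs) / len(diffs))

# Toy data: wet pixels as (row, col) tuples, waterline heights in metres.
obs_extent = {(0, 0), (0, 1), (1, 0)}
mod_extent = {(0, 0), (1, 0), (1, 1)}
fit = areal_fit(obs_extent, mod_extent)          # 2 shared / 4 total = 0.5
rms = rms_height_diff([5.0, 5.2], [5.1, 5.1])    # ~0.1 m
```

In a GLUE-style analysis, either score would be computed for each model run across the sampled friction-parameter range, and runs scoring poorly would be rejected before building the inundation uncertainty map.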
Abstract:
This paper was given at a meeting of the Society held on 12 January 2006 and it discusses the relationship between academic research and developer-funded archaeology in Britain today, highlighting the strengths and weaknesses of each. It considers the relationship between archaeological theory and practice and discusses the changing roles of academics, fieldworkers and managers. It argues that important issues need to be resolved, including the dissemination of information from recent archaeological fieldwork and the use of ‘grey literature’ in informing more ambitious interpretations of the past.
Abstract:
It is becoming increasingly difficult to resource field archaeology outside the developer-funded context. Consequently it is difficult to engage the wider public in understanding the nature of archaeology and in the writing of its history from the study of material and environmental evidence. This paper describes a project funded by the UK's Heritage Lottery Fund designed to increase access by several means to a well-established and long-running archaeological excavation at the Iron Age and Roman Town at Silchester, Hampshire (UK).
Abstract:
This paper describes the main changes introduced by the Commons Act 2006 for the registration of land as a town or village green. The purpose of the Commons Act 2006 is to protect common land and promote sustainable farming, public access to the countryside and the interests of wildlife. The changes under s15 of the Commons Act 2006 include the additional 2-year grace period for application, the discounting of statutory periods of closure, the correction of mistakes in registers, the disallowing of severance of rights, voluntary registration, replacement of land in exchange and some other provisions. The transitional provision contained in s15(4) of the Commons Act 2006 is a particular cause of controversy, as DEFRA has indicated that buildings will have to be taken down where development has gone ahead and a subsequent application to register the land as a green succeeds, obliging the developer to return the land to a condition consistent with the exercise by locals of recreational rights. The upshot is that it will be harder in future to develop land which has the potential to be registered as a town or village green.
Abstract:
Organizational issues are inhibiting the implementation and strategic use of information technologies (IT) in the construction sector. This paper focuses on these issues and explores processes by which emerging technologies can be introduced into construction organizations. The paper is based on a case study, conducted in a major house building company that was implementing a virtual reality (VR) system for internal design review in the regional offices. Interviews were conducted with different members of the organization to explore the introduction process and the use of the system. The case study findings provide insight into the process of change, the constraints that inhibit IT implementation and the relationship between new technology and work patterns within construction organizations. They suggest that (1) user-developer communications are critical for the successful implementation of non-diffused innovations in the construction industry; and (2) successful uptake of IT requires both strategic decision-making by top management and decision-making by technical managers.
Abstract:
Global financial activity is heavily concentrated in a small number of world cities –international financial centers. The office markets in those cities receive significant flows of investment capital. The growing specialization of activity in IFCs and innovations in real estate investment vehicles lock developer, occupier, investment, and finance markets together, creating common patterns of movement and transmitting shocks from one office market throughout the system. International real estate investment strategies that fail to recognize this common source of volatility and risk may fail to deliver the diversification benefits sought.
Abstract:
Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of a different outcome to the one expected (modelled) the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can happen. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or other spreadsheet) and to work with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for the risk. This model allows the user a better understanding of the possible outcomes for the development. 
Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and "better" decision.
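The kind of analysis described, replacing best/worst scenarios with full probability distributions and correlated inputs, can be sketched as a toy Monte Carlo residual appraisal. All figures, distributions and the correlation value below are illustrative assumptions, not market data, and this is not Crystal Ball's model.

```python
import math
import random

def simulate_appraisal(n=4000, rho=0.5, seed=7):
    """Toy Monte Carlo development appraisal: sale value and build cost
    are lognormal and positively correlated (both driven by the market).
    Returns the mean profit plus downside (5%) and upside (95%) bounds."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rng.gauss(0, 1)
        # induce correlation rho between the two market shocks
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * z2
        gdv = 10_000_000 * math.exp(0.10 * z1)   # gross development value
        cost = 7_000_000 * math.exp(0.08 * z2)   # total development cost
        profits.append(gdv - cost)
    profits.sort()
    return {
        "mean": sum(profits) / n,
        "p05": profits[int(0.05 * n)],   # downside risk
        "p95": profits[int(0.95 * n)],   # upside potential
    }

result = simulate_appraisal()
```

Unlike a best/worst sensitivity table, the simulated distribution shows how likely each outcome is, so the developer can read off the probability of falling below a required return.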
Abstract:
The application of real options theory to commercial real estate has developed rapidly during the last 15 years. In particular, several pricing models have been applied to value real options embedded in development projects. In this study we use a case study of a mixed use development scheme and identify the major implied and explicit real options available to the developer. We offer the perspective of a real market application by exploring different binomial models and the associated methods of estimating the crucial parameter of volatility. We include simple binomial lattices and quadranomial lattices, and demonstrate the sensitivity of the results to the choice of inputs and method.
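The simplest of the models mentioned, a binomial lattice, can be sketched for the option to develop: the developer holds an American call on the completed value of the scheme with the development cost as exercise price. This is a generic Cox-Ross-Rubinstein sketch with illustrative parameters, not the paper's case-study model.

```python
import math

def crr_option_to_develop(v0, cost, sigma, r, t, steps):
    """Cox-Ross-Rubinstein lattice valuing the option to develop as an
    American call on completed value v0, exercise price = build cost."""
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1 / u                                # down factor
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)
    # payoffs at the final time step (j = number of up moves)
    values = [max(v0 * u**j * d**(steps - j) - cost, 0.0)
              for j in range(steps + 1)]
    # backward induction, allowing development (exercise) at each node
    for i in range(steps - 1, -1, -1):
        values = [
            max(v0 * u**j * d**(i - j) - cost,
                disc * (p * values[j + 1] + (1 - p) * values[j]))
            for j in range(i + 1)
        ]
    return values[0]

# e.g. completed value 100, cost 90, 30% volatility, 5% rate, 2 years
option_value = crr_option_to_develop(100.0, 90.0, 0.30, 0.05, 2.0, 200)
```

The sensitivity the paper explores is visible directly: re-running with a higher volatility estimate raises the option value, which is why the method of estimating volatility matters so much in practice.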
Abstract:
Many countries in northern Europe have seen a huge expansion in development-led archaeology over the past few decades. Legislation, frameworks for heritage management and codes of practice have developed along similar but different lines. The Valetta Convention has had considerable impact on spatial planning and new legislation on archaeological heritage management within EC countries as well as on the funding, nature and distribution of archaeological fieldwork. For the first time these 12 papers bring together data on developer-led archaeology in Britain, Ireland, France, the Low Countries, Germany and Denmark in order to review and evaluate key common issues relating to organisation, practice, legal frameworks and quality management.
Abstract:
Purpose – This study aims to provide a review of brownfield policy and the emerging sustainable development agenda in the UK, and to examine the development industry’s (both commercial and residential) role and attitudes towards brownfield regeneration and contaminated land. Design/methodology/approach – The paper analyses results from a two-stage survey of commercial and residential developers carried out in mid-2004, underpinned by structured interviews with 11 developers. Findings – The results suggest that housebuilding on brownfield is no longer the preserve of specialists, and is now widespread throughout the industry in the UK. The redevelopment of contaminated sites for residential use could be threatened by the impact of the EU Landfill Directive. The findings also suggest that developers are not averse to developing on contaminated sites, although post-remediation stigma remains an issue. The market for warranties and insurance continues to evolve. Research limitations/implications – The survey is based on a sample which represents nearly 30 per cent of UK volume housebuilding. Although the response in the smaller developer groups was relatively under-represented, non-response bias was not found to be a significant issue. More research is needed to assess the way in which developers approach brownfield regeneration at a local level. Practical implications – The research suggests that clearer Government guidance in the UK is needed on how to integrate concepts of sustainability in brownfield development and that EU policy, which has been introduced for laudable aims, is creating tensions within the development industry. There may be an emphasis towards greenfield development in the future, as the implications of the Barker review are felt. Originality/value – This is a national survey of developers’ attitudes towards brownfield development in the UK, following the Barker Review, and highlights key issues in UK and EU policy layers. 
Keywords: Brownfield sites, Contamination
Paper type: Research paper