380 results for developer
Abstract:
In the future, the telecommunications industry will focus largely on wireless applications and value-added services. To produce these services, companies in the industry cooperate with a broad community of developers. The aim of this work was to improve the case company's existing operating model for its cooperation with developers. The study focuses on mobile application developers. The operating model mainly covers the service offering within the developer alliance. In order to make strategic changes to the operating model, it was important first to identify the developers' needs and, second, to observe and analyse the environment so as to identify the main competitors and their offerings for mobile application developers. The study was carried out by conducting a postal survey among developers and, in parallel, a qualitative study of the competitors. The nature of the competitive situation and the potential competitors could be identified. The improvement proposals included both general and service-specific improvements.
Abstract:
Purpose – Expectations of future market conditions are acknowledged to be crucial for the development decision and hence for shaping the built environment. The purpose of this paper is to study the central London office market from 1987 to 2009 and test for evidence of rational, adaptive and naive expectations. Design/methodology/approach – Two parallel approaches are applied to test for either rational or adaptive/naive expectations: a vector auto-regressive (VAR) approach with Granger causality tests, and recursive OLS regression with one-step forecasts. Findings – Applying VAR models and a recursive OLS regression with one-step forecasts, the authors do not find evidence of adaptive and naive expectations of developers. Although the magnitude of the errors and the length of time lags between market signal and construction starts vary over time and development cycles, the results confirm that developer decisions are explained, to a large extent, by contemporaneous and historic conditions in both the City and the West End, but this is more likely to stem from the lengthy design, financing and planning permission processes than from adaptive or naive expectations. Research limitations/implications – More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of large demand shocks and/or irrational behaviour. Practical implications – Developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for planning and construction of the asset. Originality/value – This paper focuses the scholarly debate on real estate cycles around the role of expectations. It is also one of very few spatially disaggregate studies of the subject matter.
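The Granger-causality side of the methodology can be illustrated with a minimal sketch. All data below are synthetic and the lag structure is invented for illustration; the paper's actual series are central London market signals and construction starts, and its VAR specification is not reproduced here.

```python
import numpy as np
from scipy.stats import f as f_dist

def granger_f_test(y, x, lags):
    """F-test: do lagged values of x improve prediction of y
    beyond y's own lags? (Standard restricted-vs-unrestricted OLS test.)"""
    n = len(y)
    Y = y[lags:]
    # Lag matrices: column k holds the series shifted back by k periods
    ylags = np.column_stack([y[lags - k:n - k] for k in range(1, lags + 1)])
    xlags = np.column_stack([x[lags - k:n - k] for k in range(1, lags + 1)])
    ones = np.ones((len(Y), 1))
    Xr = np.hstack([ones, ylags])          # restricted: y on its own lags
    Xu = np.hstack([ones, ylags, xlags])   # unrestricted: add lags of x
    rss_r = np.sum((Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - Xu @ np.linalg.lstsq(Xu, Y, rcond=None)[0]) ** 2)
    df_num, df_den = lags, len(Y) - Xu.shape[1]
    F = ((rss_r - rss_u) / df_num) / (rss_u / df_den)
    return F, 1.0 - f_dist.cdf(F, df_num, df_den)

rng = np.random.default_rng(0)
n = 300
signal = rng.normal(size=n).cumsum()     # hypothetical market signal (random walk)
starts = np.empty(n)                     # "construction starts" trailing it by 2 periods
starts[:2] = signal[:2]
starts[2:] = signal[:-2] + rng.normal(scale=0.5, size=n - 2)

F, p = granger_f_test(starts, signal, lags=4)  # does the signal Granger-cause starts?
```

With this synthetic lag of two periods, the test rejects the null of no Granger causality; on real data the paper runs such tests within a full VAR framework rather than a single-equation test.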
Abstract:
This paper reviews the evidence in favour of the compact city and considers whether it is a viable policy option. Environmentalists, academics and politicians have all expressed strong support for the compact city as a basis for sustainable development. A review of the literature broadly confirms the claims made on its behalf, in particular that it is energy efficient and that it plays a crucial role in preventing rural land loss. It is further shown i) that there is nothing inevitable about the established pattern of urban dispersal, and ii) that although urban land is characterised by a number of constraints on development, it could in principle satisfy much of the projected demand for housing. Yet urban sprawl continues. Some of the reasons for this in the case of residential development are examined by comparing the residential development process with the principles of sustainable development. The general conclusion of the paper is that proposals for urban containment are likely to be strongly resisted by housebuilders.
Abstract:
Although the compact city is widely promoted as a sustainable form of urban development, little attention has been paid to the feasibility of its implementation in practice. This paper addresses this issue by presenting the findings of a questionnaire study into the viability of the compact city from the perspective of the volume housebuilder. The study shows that, although well aware of the inherent problems with the compact city, most were generally positive about the need to redirect more development back into urban areas. In addition, they suggested a large number of strategies for change, including i) a role for public sector agencies in overcoming the constraints on urban sites; ii) the need for an upward revision of acceptable densities in local plans and design guides; and iii) the creation of a separate, or revised, Use Classes Order to enable mixed-use development to compete on a level playing field. It is concluded that the residential developer could be engaged in the process of urban containment provided proposals for implementing the compact city are devised. The need for continuing research to test the actual effects of specific schemes is emphasised.
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate user requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving web access to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are:
• to study the architecture of these tools, with the aim of understanding the source of their complexity;
• to exploit that knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq.
Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with those produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different.
The thesis is composed of two parts in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences:
• our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, giving an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four colours theorem;
• our collaboration with the D.A.M.A. Project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem.
The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping, like CIC. In the second part of the manuscript many of these issues are analysed through the eyes of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, usually kept hidden from the user), one of which is specifically designed to be user driven.
Abstract:
The promise of search-driven development is that developers will save time and resources by reusing external code in their local projects. To integrate this code efficiently, users must be able to trust it; thus the trustability of code search results is just as important as their relevance. In this paper, we introduce a trustability metric to help users assess the quality of code search results and thereby ease the cost-benefit analysis they undertake when trying to find suitable integration candidates. The proposed trustability metric incorporates both user votes and the cross-project activity of developers to calculate a "karma" value for each developer. Through the karma values of all its developers, a project is ranked on a trustability scale. We present JBENDER, a proof-of-concept code search engine which implements our trustability metric, and we discuss preliminary results from an evaluation of the prototype.
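The abstract does not give the exact formula, but the shape of the idea can be sketched: combine a developer's user votes with the breadth of their cross-project activity into a karma score, then rank projects by their developers' karma. The weighting below is an assumption made purely for illustration, not JBENDER's actual metric.

```python
def developer_karma(votes, projects):
    """Hypothetical karma: net user votes scaled by the breadth of a
    developer's cross-project activity (assumed weighting, for illustration)."""
    karma = {}
    for dev, net_votes in votes.items():
        breadth = len(projects.get(dev, ()))    # number of projects contributed to
        karma[dev] = net_votes * (1 + breadth)
    return karma

def project_trustability(members, karma):
    """Rank a project on a trustability scale via its developers' mean karma."""
    if not members:
        return 0.0
    return sum(karma.get(dev, 0) for dev in members) / len(members)

# Toy data: a widely active, well-voted developer versus a single-project one
votes = {"alice": 10, "bob": 2}
projects = {"alice": ["libfoo", "libbar", "app"], "bob": ["app"]}
karma = developer_karma(votes, projects)               # alice: 40, bob: 4
score = project_trustability(["alice", "bob"], karma)  # mean karma of the team
```

The key design point the paper describes is that the project-level score is derived entirely from per-developer values, so a project inherits trust from the track record of its contributors.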
Abstract:
"What was I working on before the weekend?" and "What were the members of my team working on during the last week?" are questions frequently asked by developers. They can be answered if one keeps track of who changes what in the source code. In this work, we present Replay, a tool that allows one to replay past changes as they happened at a fine-grained level, so that a developer can watch what she has done or understand what her colleagues have done in past development sessions. With this tool, developers are able not only to understand what sequence of changes brought the system to a certain state (e.g., the introduction of a defect), but also to deduce the reasons why their colleagues performed those changes. One application of such a tool is discovering the changes that broke a developer's code.
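As a sketch of the underlying idea (not Replay's actual data model, which the abstract does not detail), a fine-grained change log can be replayed by re-applying edit events in order, stopping at any point to inspect an intermediate state, e.g. the moment a defect was introduced:

```python
from dataclasses import dataclass

@dataclass
class Change:
    """One fine-grained edit: replace `length` chars at `offset` with `text`."""
    author: str
    offset: int
    length: int
    text: str

def replay(changes, source="", upto=None):
    """Re-apply an ordered change log, optionally stopping after `upto` events."""
    for change in changes[:upto]:
        source = (source[:change.offset]
                  + change.text
                  + source[change.offset + change.length:])
    return source

# Toy session: alice introduces a defect, bob later fixes the operator
log = [
    Change("alice", 0, 0, "def add(a, b): return a - b"),
    Change("bob", 24, 1, "+"),
]
buggy = replay(log, upto=1)   # state right after alice's edit
fixed = replay(log)           # final state after bob's fix
```

Replaying with different `upto` values is what lets a developer watch the session unfold and attribute each intermediate state to its author.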
Abstract:
We present the results of an investigation into the nature of the information needs of software developers who work on projects that are part of larger ecosystems. In an open-question survey we asked framework and library developers about their information needs with respect to both their upstream and downstream projects. We investigated what kind of information is required, why it is necessary, and how the developers obtain this information. The results show that the downstream needs fall into three categories roughly corresponding to the different stages of a developer's relation with an upstream project: selection, adoption, and co-evolution. The less numerous upstream needs fall into two categories: project statistics and code usage. The current-practices part of the study shows that, to satisfy many of these needs, developers use non-specific tools and ad hoc methods. We believe that this is a largely unexplored area of research.
Abstract:
We present the results of an investigation into the nature of information needs of software developers who work in projects that are part of larger ecosystems. This work is based on a quantitative survey of 75 professional software developers. We corroborate the results identified in the survey with needs and motivations proposed in a previous survey and discover that tool support for developers working in an ecosystem context is even more meager than we thought: mailing lists and internet search are the most popular tools developers use to satisfy their ecosystem-related information needs.
Abstract:
The Local Urban Observatory in Nakuru (LUO, Kenya 2003) has developed a progressive and, to date, unique electronic information service called NakInfo. The objective of LUO is to make residents aware of public service delivery by their Local Authority, in this case the Municipal Council of Nakuru, and to give them a voice in achieving improved quality of life. NakInfo facilitates community participation in local government business and demonstrates how to implement such participation in a developing country. The LUO project was formally initiated by the Municipal Council of Nakuru in January 2003, in collaboration with the Centre for Development and Environment (CDE) of the University of Berne (Switzerland), with funding from the Swiss Agency for Development and Cooperation (SDC).