916 results for web development
Abstract:
The Perspex Machine arose from the unification of computation with geometry. We now report significant redevelopment both of a partial C compiler that generates perspex programs and of a Graphical User Interface (GUI). The compiler is constructed with standard compiler-generator tools and produces both an explicit parse tree for C and an Abstract Syntax Tree (AST) that is better suited to code generation. The GUI uses a hash table and a simpler software architecture to achieve an order-of-magnitude speed-up in processing and, consequently, an order-of-magnitude increase in the number of perspexes that can be manipulated in real time (now 6,000). Two perspex-machine simulators are provided, one using trans-floating-point arithmetic and the other using transrational arithmetic. All of the software described here is available on the World Wide Web. The compiler generates code in the neural model of the perspex. At each branch point it uses a jumper to return control to the main fibre. This has the effect of pruning out an exponentially increasing number of branching fibres, thereby greatly increasing the efficiency of perspex programs as measured by the number of neurons required to implement an algorithm. The jumpers are placed at unit distance from the main fibre and form a geometrical structure analogous to the myelin sheath of a biological neuron. Both the perspex jumper-sheath and the biological myelin sheath share the computational function of preventing cross-over of signals to neurons that lie close to an axon. This is an example of convergence driven by similar geometrical and computational constraints in perspex and biological neurons.
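The order-of-magnitude speed-up attributed to the hash table comes from replacing per-frame linear scans with constant-time lookups. A minimal sketch of that idea follows; the `Perspex` record and its four-coordinate key are illustrative assumptions, not taken from the paper's actual data structures.

```python
# Minimal sketch: indexing perspexes by grid coordinate with a hash
# table (a dict) instead of scanning a list -- turning O(n) per-frame
# lookups into O(1) average-case lookups.
from dataclasses import dataclass

@dataclass(frozen=True)
class Perspex:
    x: int
    y: int
    z: int
    t: int  # illustrative four-coordinate key for one perspex neuron

class PerspexStore:
    def __init__(self):
        self._by_coord = {}  # (x, y, z, t) -> Perspex

    def add(self, p: Perspex) -> None:
        self._by_coord[(p.x, p.y, p.z, p.t)] = p

    def find(self, x, y, z, t):
        # O(1) average-case hash lookup instead of an O(n) list scan
        return self._by_coord.get((x, y, z, t))

store = PerspexStore()
for i in range(6000):  # the abstract reports ~6,000 perspexes in real time
    store.add(Perspex(i, 0, 0, 0))
hit = store.find(42, 0, 0, 0)
```

With 6,000 objects, a list scan touches thousands of entries per query, while the dict lookup cost stays flat as the scene grows.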
Abstract:
The report examines the development of the Internet and intranets in the world of business and commerce, drawing on previous literature and research. The new technology is explained, and key issues are examined, such as the impact of the Internet on the surveyor's role as 'information broker' and its likely effect on clients' property requirements. The research is based on an analysis of 261 postal questionnaire responses and eight case-study interviews from a sample of general practice and quantity surveying practices and corporates. For the first time, the property profession is examined in detail, and the key drivers, barriers and benefits of Internet use are identified for a range of different-sized organisations.
Abstract:
Improving lifestyle behaviours has considerable potential for reducing the global burden of non-communicable diseases, promoting better health across the life-course and increasing well-being. However, realising this potential will require the development, testing and implementation of much more effective behaviour change interventions than are used conventionally. Therefore, the aim of this study was to conduct a multi-centre, web-based, proof-of-principle study of personalised nutrition (PN) to determine whether providing more personalised dietary advice leads to greater improvements in eating patterns and health outcomes compared to conventional population-based advice. A total of 5,562 volunteers were screened across seven European countries; the first 1,607 participants who fulfilled the inclusion criteria were recruited into the trial. Participants were randomly assigned to one of the following intervention groups for a 6-month period: Level 0, the control group, receiving conventional, non-PN advice; Level 1, receiving PN advice based on dietary intake data alone; Level 2, receiving PN advice based on dietary intake and phenotypic data; and Level 3, receiving PN advice based on dietary intake, phenotypic and genotypic data. The 1,607 randomised participants had a mean age of 39.8 years (range 18 to 79 years). Of these participants, 60.9% were women and 96.7% were from a white-European background. The mean BMI for all randomised participants was 25.5 kg m⁻², and 44.8% of the participants had a BMI ≥ 25.0 kg m⁻². Food4Me is the first large multi-centre RCT of web-based PN. The main outcomes from the Food4Me study will be submitted for publication during 2015.
Abstract:
Background: 29 autoimmune diseases, including Rheumatoid Arthritis, gout, Crohn's Disease, and Systemic Lupus Erythematosus, affect 7.6-9.4% of the population. While effective therapy is available, many patients do not follow treatment or use medications as directed. Digital health and Web 2.0 interventions have demonstrated much promise in increasing medication and treatment adherence, but to date many Internet tools have proven disappointing. In fact, most digital interventions continue to suffer from high attrition in patient populations, are burdensome for healthcare professionals, and have relatively short life spans. Objective: Digital health tools have traditionally centered on the transformation of existing interventions (such as diaries, trackers, stage-based or cognitive behavioral therapy programs, coupons, or symptom checklists) to electronic format. Advanced digital interventions have also incorporated attributes of Web 2.0 such as social networking, text messaging, and the use of video. Despite these efforts, there has been little measurable impact on non-adherence for illnesses that require medical interventions, and research must look to other strategies or development methodologies. As a first step in investigating the feasibility of developing such a tool, the objective of the current study is to systematically rate factors of non-adherence that have been reported in past research studies. Methods: Grounded Theory, recognized as a rigorous method that facilitates the emergence of new themes through systematic analysis, data collection and coding, was used to analyze quantitative, qualitative and mixed-method studies addressing the following autoimmune diseases: Rheumatoid Arthritis, gout, Crohn's Disease, Systemic Lupus Erythematosus, and inflammatory bowel disease. Studies were only included if they contained primary data addressing the relationship with non-adherence.
Results: Out of the 27 studies, four non-modifiable and 11 modifiable risk factors were discovered. Over one third of articles identified the following risk factors as common contributors to medication non-adherence (percent of studies reporting): patients not understanding treatment (44%), side effects (41%), age (37%), dose regimen (33%), and perceived medication ineffectiveness (33%). An unanticipated finding that emerged was the need for risk stratification tools (81%) with patient-centric approaches (67%). Conclusions: This study systematically identifies and categorizes medication non-adherence risk factors in select autoimmune diseases. Findings indicate that patients' understanding of their disease and the role of medication are paramount. An unexpected finding was that the majority of research articles called for the creation of tailored, patient-centric interventions that dispel personal misconceptions about disease, pharmacotherapy, and how the body responds to treatment. To our knowledge, these interventions do not yet exist in digital format. Rather than adopting a systems-level approach, digital health programs should focus on cohorts with heterogeneous needs, and develop tailored interventions based on individual non-adherence patterns.
Abstract:
This study investigates how summer thunderstorms developed over the city of Sao Paulo and whether pollution might affect their development or characteristics during the austral summer (December-January-February-March, DJFM months). A total of 605 days from December 1999 to March 2004 were separated into 241 thunderstorm days (TDs) and 364 non-thunderstorm days (NTDs). The analyses are performed using hourly measurements of air temperature (T), wet-bulb temperature (Tw), surface atmospheric pressure (P), wind velocity and direction, rainfall, and thunder and lightning observations collected at the Meteorological Station of the University of Sao Paulo, in conjunction with aerosol measurements obtained by AERONET (Aerosol Robotic Network), the NCEP-DOE (National Centers for Environmental Prediction Department of Energy) reanalysis and radiosondes. The wind diurnal cycle shows that on TDs the morning flow is from the northwest, rotating to the southeast after 16:00 local time (LT), and it remains from the east until the night. On NTDs, the wind is well characterized by the sea-breeze circulation, blowing from the northeast in the morning and from the southeast in the afternoon. On TDs, the air temperature diurnal cycle presents a higher amplitude, and the maximum temperature of the day is 3.2 degrees C higher than on NTDs. Another important difference is moisture, which is higher during TDs. In terms of precipitation, TDs represent 40% of the total days analyzed but are responsible for more than 60% of the total rain accumulation during the summer; for instance, 50% of the TDs had more than 15.5 mm day⁻¹, while the NTDs had 4 mm day⁻¹. Moreover, the rainfall distribution shows that TDs have higher rainfall rate intensities and an afternoon precipitation maximum, while on NTDs there is no defined precipitation diurnal cycle.
The wind and temperature fields from the NCEP reanalysis concur with the local weather station and radiosonde observations. The NCEP composites show that TDs are controlled by a synoptic circulation characterized by a pre-frontal situation, with a baroclinic zone situated at the southern part of Sao Paulo. In terms of pollution, this study employed the AERONET data to obtain the main aerosol characteristics in the atmospheric column for both TDs and NTDs. The particle size distribution and particle volume size distribution have similar concentrations for both TDs and NTDs and present a similar fine- and coarse-mode mean radius. With respect to the atmospheric loading, the aerosol optical depth (AOD) at different frequencies presented close mean values for both TDs and NTDs that were statistically significant at the 95% level. The spectral dependency of those values, in conjunction with the Angstrom parameter, reveals a higher concentration of fine-mode particles, which are more likely to be hygroscopic and from urban areas. In summary, no significant aerosol effect could be found on the development of summer thunderstorms, suggesting strong synoptic control by the baroclinic forcing for deep convective development. (C) 2010 Published by Elsevier B.V.
Abstract:
The widespread use of service-oriented architectures (SOAs) and Web services in commercial software requires the adoption of development techniques to ensure the quality of Web services. Testing techniques and tools play a critical role in achieving quality in SOA-based systems. Existing techniques and tools for traditional systems are not appropriate for these new systems, making the development of Web service testing techniques and tools necessary. This article presents new testing techniques to automatically generate a set of test cases and data for Web services. The techniques presented here apply data perturbation to Web service messages with respect to data types, integrity and consistency. To support these techniques, a tool (GenAutoWS) was developed and applied to real problems. (C) 2010 Elsevier Inc. All rights reserved.
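The core idea, generating test cases by perturbing the data in service messages, can be sketched as follows. This is a minimal illustration, not the GenAutoWS tool itself: the field names are invented, and a plain dict stands in for a SOAP/XML payload.

```python
# Sketch of data-perturbation test generation: given a sample
# web-service message, produce mutated variants that probe data-type,
# integrity and consistency limits of the service under test.
import copy

def perturb_value(v):
    """Yield type-aware perturbations of a single field value."""
    if isinstance(v, bool):            # check bool before int (bool is an int)
        yield not v
    elif isinstance(v, int):
        yield 0
        yield -v
        yield 2**31 - 1                # integer boundary value
    elif isinstance(v, float):
        yield 0.0
        yield float("inf")
        yield float("nan")
    elif isinstance(v, str):
        yield ""                        # empty string
        yield v * 100                   # oversized string
        yield "<tag>"                   # markup injection probe
    else:
        yield None                      # null perturbation

def generate_test_cases(message):
    """One mutated copy of the message per (field, perturbation) pair."""
    cases = []
    for field, value in message.items():
        for bad in perturb_value(value):
            mutated = copy.deepcopy(message)
            mutated[field] = bad
            cases.append(mutated)
    return cases

sample = {"orderId": 17, "amount": 99.5, "customer": "alice"}
cases = generate_test_cases(sample)
```

Each generated case keeps the message structure valid while pushing one field to a boundary, so a failing response isolates which field's handling is fragile.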
Abstract:
The purpose of this presentation is to discuss how teachers' leadership can be used as a teaching method in web-based language education. Environments that offer online courses provide a wide field for discussion of the contact between teacher and student. My intention is to contribute to the debate on teacher leadership in online courses. In my earlier studies on leadership, I explored how some religious leaders affected different social movements in Brazil during the military dictatorship (1964-1985). In Pruth (2004), by examining the three kinds of legitimacy described by Max Weber, I aimed to see and analyse how religious leaders used different teaching methods to explain their messages to ordinary citizens. That research showed how educational leadership is a way to get people to reach their goals. I became interested in the subject of teachers' leadership when I participated in a survey of the teaching methods of language courses at Dalarna University, funded by the NGL Center of Dalarna University. In this project, we interviewed the teachers, reviewed the course plans of the language department at Dalarna University and categorized the learning outcomes. A questionnaire was constructed based on the learning outcomes and then either sent out remotely to teachers or completed face to face through interviews. The answers to the questionnaires enabled the project to identify many differences in how language teachers interact with their students: the way they give feedback, motivate and help students, and the types of class activities and materials they use. This made me aware of how teachers use, or do not use, their leadership in their teaching. My focus is the relationship between teachers and students as an important part of the development and quality of online courses. The teacher's performance on campus is different from that in online courses.
I want to understand how the contact between teachers and students in online courses develops, how students can make use of this contact, and what influence the teacher's leadership has on students' ability to achieve the goals of their course.
Abstract:
Service-based architectures enable the development of new classes of Grid and distributed applications. One of the main capabilities provided by such systems is the dynamic and flexible integration of services, according to which services are allowed to be a part of more than one distributed system and simultaneously serve different applications. This increased flexibility in system composition makes it difficult to address classical distributed system issues such as fault-tolerance. While it is relatively easy to make an individual service fault-tolerant, improving fault-tolerance of services collaborating in multiple application scenarios is a challenging task. In this paper, we look at the issue of developing fault-tolerant service-based distributed systems, and propose an infrastructure to implement fault tolerance capabilities transparent to services.
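One classic way to make fault tolerance transparent to the services themselves is an invocation proxy that retries transient faults and fails over between replicas; the sketch below illustrates that pattern under stated assumptions (the paper's actual infrastructure is not reproduced, and the exception and class names are invented).

```python
# Sketch: a client-side proxy adding retry-with-failover around a set
# of service replicas, so the individual services contain no
# fault-tolerance code of their own.
class ServiceUnavailable(Exception):
    """Raised by a replica that cannot serve the request."""

class FaultTolerantProxy:
    def __init__(self, replicas, retries_per_replica=2):
        self.replicas = replicas          # callables standing in for services
        self.retries = retries_per_replica

    def call(self, *args, **kwargs):
        last_error = None
        for replica in self.replicas:     # fail over, replica by replica
            for _ in range(self.retries): # retry transient faults first
                try:
                    return replica(*args, **kwargs)
                except ServiceUnavailable as exc:
                    last_error = exc
        raise last_error                  # every replica exhausted

def flaky(x):
    raise ServiceUnavailable("primary down")

def healthy(x):
    return x * 2

proxy = FaultTolerantProxy([flaky, healthy])
result = proxy.call(21)
```

Because the proxy sits between caller and service, the same service instance can participate in several applications while each obtains fault tolerance without the service being modified, which matches the transparency goal described above.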
Abstract:
GCM outputs such as CMIP3 are available via network access to the PCMDI web site. Meteorological researchers are familiar with using GCM data, but most researchers outside meteorology, in fields such as agriculture and civil engineering, as well as the general public, are not. There are several difficulties in using GCM output: 1) downloading the enormous quantity of data, and 2) understanding the GCM methodology, parameters and grids. In order to provide quick access to GCM output, the Climate Change Information Database has been developed. The purpose of the database is to bridge users and meteorological specialists and to facilitate understanding of climate change. The resolution of the data is unified, and the climate change amount or factors for each meteorological element are provided by the database. All data in the database are interpolated onto the same 80 km mesh. The available data are the present-future projections of 27 GCMs, 16 meteorological elements (precipitation, temperature, etc.) and 3 emission scenarios (A1B, A2, B1). We showed a summary of this database to residents of Toyama prefecture and, using an Internet questionnaire survey, measured the effect of showing it and assessed their image of climate change. People who feel a climate change at present tend to expect additional changes in the future. It is important to show citizens the monitoring results of climate change and to promote understanding of the climate change that has already occurred. It was shown that general images of climate change promote understanding of the need for mitigation, and that, in order to have people widely recognise the need for adaptation, it is important to explain climate change that might occur in the future even if it has not occurred at present.
Abstract:
The objective of this study is to develop a Pollution Early Warning System (PEWS) for efficient management of water quality in oyster harvesting areas. To that end, this paper presents a web-enabled, user-friendly PEWS for managing water quality in oyster harvesting areas along the Louisiana Gulf Coast, USA. The PEWS consists of (1) an Integrated Space-Ground Sensing System (ISGSS) gathering data on environmental factors influencing water quality, (2) an Artificial Neural Network (ANN) model for predicting the level of fecal coliform bacteria, and (3) a web-enabled, user-friendly Geographic Information System (GIS) platform for issuing water pollution advisories and managing oyster harvesting waters. The ISGSS (data acquisition system) collects near-real-time environmental data from various sources, including the NASA MODIS Terra and Aqua satellites and in-situ sensing stations managed by the USGS and the NOAA. The ANN model is developed using the ANN program in the MATLAB Toolbox. The ANN model involves a total of 6 independent environmental variables, including rainfall, tide, wind, salinity, temperature, and weather type, along with 8 different combinations of the independent variables. The ANN model is constructed and tested using environmental and bacteriological data collected monthly from 2001-2011 by the Louisiana Molluscan Shellfish Program at seven oyster harvesting areas along the Louisiana coast, USA. The ANN model is capable of explaining about 76% of the variation in fecal coliform levels for model training data and 44% for independent data. The web-based GIS platform is developed using ArcView GIS and ArcIMS. The web-based GIS system can be employed for mapping fecal coliform levels, predicted by the ANN model, and potential risks of norovirus outbreaks in oyster harvesting waters. The PEWS is able to inform decision-makers of potential risks of fecal pollution and virus outbreak on a daily basis, greatly reducing the risk of contaminated oysters to human health.
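The structure of the ANN component, six environmental predictors feeding a small feed-forward network that outputs a coliform level, can be sketched as below. This is purely illustrative: it is trained on synthetic data, not the Louisiana records, and the layer sizes and learning rate are assumptions rather than the study's actual MATLAB configuration.

```python
# Illustrative feed-forward ANN with 6 inputs (rainfall, tide, wind,
# salinity, temperature, weather type) predicting a coliform-like signal.
# Trained here on synthetic data with plain batch gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(300, 6))             # 6 standardized predictors
true_w = np.array([2.0, -1.0, 0.5, 1.5, -0.5, 1.0])
y = X @ true_w + 0.05 * rng.normal(size=300)     # synthetic target signal

# one hidden layer of 8 tanh units, linear output
W1 = rng.normal(scale=0.5, size=(6, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8,));   b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2, h

lr = 0.1
for _ in range(3000):
    pred, h = forward(X)
    err = pred - y                                # gradient of 0.5*MSE
    gW2 = h.T @ err / len(X); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)         # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

pred, _ = forward(X)
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

The reported figures (76% of variance explained on training data, 44% on independent data) refer to the study's real dataset; the `r2` here only shows the same goodness-of-fit measure computed for the sketch.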
Abstract:
This dissertation addresses the use of Information and Communication Technology applied to public management processes in the light of the concepts of effectiveness, efficiency and accountability. To that end, the study rests on two theoretical frameworks. The first concerns scientific-technological development and its implications for the construction of a society resulting from the interaction of microelectronics, informatics and telecommunications. The second concerns the reform of the Brazilian State, in a context in which the need to make it more agile, flexible and more accountable to society is under discussion. Methodologically, we adopted a multiple case study, in which we analysed the Pregão Eletrônico (electronic reverse auction) used by the Federal Government for the acquisition of goods and services, modelled on the reverse auction of the Amsterdam flower market. Specifically, the case is Pregão 21/2001, conducted by the Ministério da Previdência e Assistência Social for the purchase of medicines. The study covers not only different aspects of electronic commerce, such as procurement, but also describes the traditional public procurement process. We conclude that the adoption of information technology applied to public management, especially as a tool for the acquisition of goods and services, proved efficient by reducing the costs both of government processes and of the products acquired, an extremely relevant fact given the Brazilian budgetary reality. Its effectiveness was also demonstrated, evidenced by the reduction in the time needed to carry out the procedure compared to the traditional public procurement process. Furthermore, we can state that the initiative increases the transparency of Brazilian public-sector information, reconfiguring State-society relations.
Abstract:
As enterprises constantly grow and the need to share information across departments and business areas becomes more critical, companies are turning to integration to provide a method for interconnecting heterogeneous, distributed and autonomous systems. Whether the sales application needs to interface with the inventory application, or the procurement application needs to connect to an auction site, it seems that any application can be made better by integrating it with other applications. Integration between applications can face several problems, because applications may not have been designed and implemented with integration in mind. Regarding integration, two-tier software systems, composed of a database tier and a "front-end" tier (interface), have shown some limitations. As a solution to overcome these limitations, three-tier systems were proposed in the literature. By adding a middle tier (referred to as middleware) between the database tier and the "front-end" tier (or simply the application), three main benefits emerge. The first benefit is that the division of software systems into three tiers enables increased integration capabilities with other systems. The second is that modifications to individual tiers may be carried out without necessarily affecting the other tiers and integrated systems. The third, a consequence of the others, is that fewer maintenance tasks are needed in the software system and in all integrated systems. Concerning software development in three tiers, this dissertation focuses on two emerging technologies, the Semantic Web and Service-Oriented Architecture, combined with middleware.
These two technologies, blended with middleware, resulted in the development of the Swoat framework (Service and Semantic Web Oriented ArchiTecture) and lead to the following four synergistic advantages: (1) they allow the creation of loosely coupled systems, decoupling the database from the "front-end" tier and therefore reducing maintenance; (2) the database schema is transparent to the "front-end" tiers, which are aware only of the information model (or domain model) that describes what data is accessible; (3) integration with other heterogeneous systems is enabled through services provided by the middleware; (4) a service request by the "front-end" tier focuses on 'what' data is needed rather than on 'where' and 'how' it is obtained, reducing application development time.
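The decoupling benefits above can be seen in a minimal three-tier sketch: the front-end speaks only the domain model, while the middleware alone knows the database schema. This is a toy illustration of the general pattern, not the Swoat framework; the `StudentService` facade and the table layout are invented for the example.

```python
# Sketch of the three-tier separation: front-end -> middleware -> database.
# The front-end requests 'what' data (domain terms); the middleware
# handles 'where' and 'how' (SQL against a schema the front-end never sees).
import sqlite3

# --- database tier: cryptic schema known only to the middleware ---------
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t_stud (stud_id INTEGER, full_nm TEXT)")
db.execute("INSERT INTO t_stud VALUES (1, 'Ada'), (2, 'Grace')")

# --- middleware tier: exposes the domain model as a service -------------
class StudentService:
    """Service facade; callers depend only on the domain model."""
    def __init__(self, conn):
        self._conn = conn

    def get_student(self, student_id):
        row = self._conn.execute(
            "SELECT stud_id, full_nm FROM t_stud WHERE stud_id = ?",
            (student_id,)).fetchone()
        # map schema columns to domain-model terms
        return {"id": row[0], "name": row[1]} if row else None

# --- front-end tier: depends only on the service contract ---------------
service = StudentService(db)
student = service.get_student(2)
```

Renaming `t_stud` or `full_nm` now touches only the middleware; every front-end and every integrated system keeps calling `get_student` unchanged, which is exactly the maintenance benefit claimed for the middle tier.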
Abstract:
Over the years, the use of application frameworks designed for the View and Controller layers of the MVC architectural pattern, adapted to web applications, has become very popular. These frameworks are classified as action-oriented or component-oriented, according to the solution strategy adopted by the tool. The choice of strategy leads the system architecture to acquire non-functional characteristics caused by the way the framework influences the developer to implement the system. Component reusability is one of those characteristics and plays a very important role in development activities such as system evolution and maintenance. This dissertation analyses how reusability is influenced by the use of Web frameworks. To accomplish this, small academic management applications were developed using the latest versions of the Apache Struts and JavaServer Faces frameworks, the main representatives of Web frameworks for the Java platform. For this assessment, a software quality model was used that associates internal attributes, which can be measured objectively, with the characteristics in question. The attributes and metrics defined for the model were based on related work discussed in the document.
Abstract:
Recently, the focus given to Web Services and Semantic Web technologies has led to several research projects addressing the Web service composition issue in different ways. Meanwhile, the challenge of creating an environment in which an abstract business process can be specified and then automatically implemented by a composite service in a dynamic way is still considered an open problem. WSDL and BPEL, provided by industry, support only manual service composition, because they lack the semantics needed for Web services to be discovered, selected and combined by software agents. Service ontologies provided by the Semantic Web enrich the syntactic descriptions of Web services to facilitate the automation of tasks such as discovery and composition. This work presents an environment for specifying and executing, in an ad hoc manner, Web service-based business processes, named WebFlowAH. WebFlowAH employs a common domain ontology to describe both Web services and business processes. It allows processes to be specified in terms of user goals or desires expressed using the concepts of this common domain ontology. This approach allows processes to be specified in an abstract, high-level way, unburdening the user from the underlying details needed to effectively run the process workflow.
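The automation gap described above, chaining services from a goal rather than wiring them by hand, can be sketched with a simple planner over semantically annotated services. This is a generic illustration of ontology-driven composition, not WebFlowAH itself; the concept and service names are invented.

```python
# Sketch of goal-driven composition: each service is annotated with the
# ontology concepts it consumes (inputs) and produces (outputs); a
# forward-chaining search builds an ordered plan that reaches the goal.
services = {
    "GeocodeService": ({"Address"}, {"Coordinates"}),
    "WeatherService": ({"Coordinates"}, {"Forecast"}),
    "PackingService": ({"Forecast"}, {"PackingList"}),
}

def compose(known, goal):
    """Return an ordered service plan producing `goal`, or None."""
    plan, available = [], set(known)
    progress = True
    while goal not in available and progress:
        progress = False
        for name, (inputs, outputs) in services.items():
            # a service is applicable once all its input concepts are known
            if name not in plan and inputs <= available:
                plan.append(name)
                available |= outputs
                progress = True
    return plan if goal in available else None

plan = compose({"Address"}, "PackingList")
```

Given only the user's goal concept (`PackingList`) and starting knowledge (`Address`), the planner discovers the chain itself, which is what the shared domain ontology makes possible and what plain WSDL/BPEL descriptions cannot support automatically.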
Abstract:
The World Wide Web has been consolidated over recent years as a standard platform for delivering software systems on the Internet. Nowadays, a great variety of user applications are available on the Web, ranging from corporate applications to banking, and from electronic commerce to government. Given the quantity of information available and the number of users dealing with these services, many Web systems offer usage recommendations as part of their functionality, so that users can make better use of the available services based on their profile, navigation history and system use. In this context, this dissertation proposes the development of an agent-based framework that offers recommendations to users of Web systems. It involves the conception, design and implementation of an object-oriented framework. The framework agents can be plugged into or unplugged from existing Web applications in a non-invasive way using aspect-oriented techniques. The framework is evaluated through its instantiation in three different Web systems.
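The non-invasive plugging described above can be illustrated with decorator-style advice woven around an existing handler at runtime, a Python stand-in for the aspect-oriented weaving the dissertation uses. The handler, the history store and the recommendation rule here are invented for the example.

```python
# Sketch: a recommendation "agent" woven around an existing Web handler
# without editing the handler's code -- advice records the visit and
# attaches recommendations to the response.
import functools
from collections import Counter

history = Counter()  # (user, page) -> visit count

def recommending(handler):
    """Advice: record the visit, then append recommendations."""
    @functools.wraps(handler)
    def wrapper(user, page):
        history[(user, page)] += 1
        response = handler(user, page)      # original behaviour, untouched
        seen = [p for (u, p), _ in history.most_common() if u == user]
        response["recommended"] = [p for p in seen if p != page][:3]
        return response
    return wrapper

# --- existing application code, unchanged -------------------------------
def view_page(user, page):
    return {"user": user, "page": page}

# --- weaving: plug the agent in without modifying view_page -------------
view_page = recommending(view_page)

view_page("ana", "home")
resp = view_page("ana", "books")
```

Unplugging the agent is equally non-invasive: simply stop wrapping the handler, and the application behaves exactly as before, which is the property the aspect-oriented approach is meant to guarantee.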