907 results for User-Designer Collaboration, Problem Restructuring, Scenario Building
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
One of the biggest problems in large cities today is visual pollution. Advertising has become a necessity for every type of business, leading businesses to occupy as much urban space as possible with storefront structures and lights, loaded with too many colors and details to hold the eye, and this has become a significant problem for social and cultural coexistence. The work of the graphic designer can reduce this problem by applying the concepts of visual identity to a building's facade, bringing harmony to the colors and to the spaces where advertising can be placed without assaulting the beauty and aesthetics of the city.
Abstract:
Research literature is replete with the importance of collaboration in schools, the lack of its implementation, the centrality of the role of the principal, and the existence of a gap between knowledge and practice--or a "Knowing-Doing Gap." In other words, there is a set of knowledge that principals must know in order to create a collaborative workplace environment for teachers. This study sought to describe what high school principals know about creating such a culture of collaboration. The researcher combed journal articles, studies and professional literature in order to identify what principals must know in order to create a culture of collaboration. The result was ten elements of principal knowledge: Staff involvement in important decisions, Charismatic leadership not being necessary for success, Effective elements of teacher teams, Administrator's modeling of professional learning, The allocation of resources, Staff meetings focused on student learning, Elements of continuous improvement, and the Principles of Adult Learning, of Student Learning, and of Change. From these ten elements, the researcher developed a web-based survey intended to measure nine of those elements (Charismatic leadership was excluded). Principals of accredited high schools in the state of Nebraska were invited to participate in this survey, as high schools are well-known for the isolation that teachers experience--particularly as a result of departmentalization. The results indicate that principals have knowledge of eight of the nine measured elements. The one they lacked an understanding of was the Principles of Student Learning. Given these two findings of what principals do and do not know, the researcher recommends that professional organizations, intermediate service agencies and district-level support staff engage in systematic and systemic initiatives to increase the knowledge of principals in the element where knowledge is lacking.
Further, given that eight of the nine elements are understood by principals, it would be wise to examine reasons for the implementation gap (Knowing-Doing Gap) and how to overcome it.
Abstract:
The purpose of this study is to determine if students solve math problems using addition, subtraction, multiplication, and division consistently and whether students transfer these skills to other mathematical situations and solutions. In this action research study, a classroom of 6th grade mathematics students was used to investigate how students solve word problems and how they determine which mathematical approach to use to solve a problem. It was discovered that many of the students read and re-read a question before they try to find an answer. Most students will check their answer to determine if it is correct and makes sense. Most students agree that mastering basic math facts is very important for problem solving, yet prefer mathematics that does not focus on problem solving. As a result of this research, the need for a unified and focused curriculum, with a scope and sequence for delivery that is consistently followed, will be emphasized to the building principal and staff. The importance of mastering basic math skills and making sure each student is challenged to be a mathematical thinker will also be stressed.
Abstract:
End users develop more software than any other group of programmers, using software authoring devices such as e-mail filtering editors, by-demonstration macro builders, and spreadsheet environments. Despite this, there has been little research on finding ways to help these programmers with the dependability of their software. We have been addressing this problem in several ways, one of which includes supporting end-user debugging activities through fault localization techniques. This paper presents the results of an empirical study conducted in an end-user programming environment to examine the impact of two separate factors in fault localization techniques that affect technique effectiveness. Our results provide new insights into fault localization techniques for end-user programmers and the factors that affect them, with significant implications for the evaluation of those techniques.
Abstract:
Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software. We have been working to address this problem by finding ways to provide at least some of the benefits of formal software engineering techniques to end-user programmers. In this talk, focusing on the spreadsheet application paradigm, I present several of our approaches, concentrating on methodologies that utilize source-code-analysis techniques to help end-users build more dependable spreadsheets. Behind the scenes, our methodologies use static analyses such as dataflow analysis and slicing, together with dynamic analyses such as execution monitoring, to support user tasks such as validation and fault localization. I show how, to accommodate the user base of spreadsheet languages, an interface to these methodologies can be provided in a manner that does not require an understanding of the theory behind the analyses, yet supports the interactive, incremental process by which spreadsheets are created. Finally, I present empirical results gathered in the use of our methodologies that highlight several cost-benefit trade-offs, and many opportunities for future work.
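The slicing idea mentioned in the two abstracts above can be illustrated with a toy backward slice over a spreadsheet's cell-dependency graph; the cells, formulas, and dependency structure here are entirely hypothetical, a minimal sketch rather than the authors' actual analysis:

```python
# Toy backward slice over a spreadsheet dependency graph.
# All cell names and dependencies below are hypothetical.

deps = {                 # cell -> cells its formula reads
    "A3": ["A1", "A2"],  # e.g. A3 = A1 + A2
    "B1": ["A3"],        # e.g. B1 = A3 * 2
    "B2": ["B1", "A1"],  # e.g. B2 = B1 - A1
    "C1": [],            # a constant cell
}

def backward_slice(cell):
    """Return every cell that can influence `cell` (including itself)."""
    seen, stack = set(), [cell]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(deps.get(c, []))
    return seen

# Cells worth inspecting when B2 looks faulty:
print(sorted(backward_slice("B2")))  # -> ['A1', 'A2', 'A3', 'B1', 'B2']
```

A fault localization tool can then rank the cells in such a slice, so the user inspects only formulas that could actually have caused the wrong value.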
Abstract:
We propose an alternative, nonsingular cosmic scenario based on gravitationally induced particle production. The model is an attempt to evade the coincidence and cosmological constant problems of the standard model (ΛCDM) and also to connect the early and late-time accelerating stages of the Universe. Our space-time emerges from a pure initial de Sitter stage, thereby providing a natural solution to the horizon problem. Subsequently, due to an instability provoked by the production of massless particles, the Universe evolves smoothly to the standard radiation-dominated era, thereby ending the production of radiation as required by conformal invariance. Next, the radiation becomes subdominant, with the Universe entering the cold dark matter dominated era. Finally, the negative pressure associated with the creation of cold dark matter (CCDM model) particles accelerates the expansion and drives the Universe to a final de Sitter stage. The late-time cosmic expansion history of the CCDM model is exactly like that of the standard ΛCDM model; however, there is no dark energy. The model evolves between two limiting (early and late-time) de Sitter regimes. All the stages are also discussed in terms of a scalar field description. This complete scenario is fully determined by two extreme energy densities, or equivalently, the associated de Sitter Hubble scales, connected by ρ_I/ρ_f = (H_I/H_f)^2 ~ 10^122, a result that has no correlation with the cosmological constant problem. We also study the linear growth of matter perturbations at the final accelerating stage. It is found that the CCDM growth index can be written as a function of the Λ growth index, γ_Λ ≃ 6/11. In this framework, we also compare the observed growth rate of clustering with that predicted by the current CCDM model. Performing a χ² statistical test, we show that the CCDM model provides growth rates that match sufficiently well with the observed growth rate of structure.
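The growth-rate comparison in the abstract above rests on a standard chi-squared statistic between observed and model-predicted values. As a generic illustration (every number below is a made-up placeholder, not a value from the paper):

```python
# Generic chi-squared goodness-of-fit between observed data points
# and a model prediction. All data below are illustrative placeholders.

def chi_squared(observed, predicted, errors):
    """Sum of squared, error-weighted residuals."""
    return sum(((o - p) / e) ** 2
               for o, p, e in zip(observed, predicted, errors))

# Hypothetical growth-rate measurements with 1-sigma errors
observed  = [0.47, 0.43, 0.40]
errors    = [0.05, 0.04, 0.04]
# Hypothetical model prediction at the same redshifts
predicted = [0.46, 0.44, 0.41]

chi2 = chi_squared(observed, predicted, errors)
dof = len(observed) - 1  # data points minus fitted parameters, say
print(f"chi^2 = {chi2:.3f} for {dof} degrees of freedom")
```

A chi-squared per degree of freedom near 1 signals that the model's predictions are statistically consistent with the measurements, which is the sense in which a model "matches sufficiently well" with the data.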
Abstract:
Ubiquitous Computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, and using any device. In this scenario, new challenges for the practice of software development arise: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a wide range of contextual usage requirements and hardware aspects. In particular, due to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. To overcome such obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation - the implementation of static Web interfaces - with dynamic adaptation - the alteration, at execution time, of static interfaces so that they adapt to different contexts of use. In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies, static and dynamic. Along this line, we designed and implemented UbiCon, a framework on which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces, and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.
Abstract:
The fast and strong social and economic transformations in the economies of many countries have intensified the competition for consumers. One of the elements required to adapt to such a scenario is knowing customers and their perceptions of products or services, mainly regarding word-of-mouth recommendations. This study adapts, to the fast food business, a model originally designed to analyze the antecedents of the intent to recommend by clients of formal restaurants. Three constructs were considered: service quality, satisfaction, and social well-being, the latter comprising positive and negative affects. Six hypotheses were considered: three relating to social well-being (that it influences satisfaction, service quality, and the intent to recommend), two relating to service quality (that it influences the intent to recommend and satisfaction), and one relating to the influence of satisfaction on the intent to recommend. None was rejected, indicating adherence and adjustment of the simplification and adaptation of the consolidated model. Through a successful empirical application, the main contribution made by this research is the simplification of a model through its application in a similar context, but with a different scope.
Abstract:
Breakthrough advances in microprocessor technology and efficient power management have altered the course of processor development with the emergence of multi-core processor technology, bringing higher levels of processing. The use of many-core technology has boosted the computing power provided by clusters of workstations or SMPs, delivering large computational power at an affordable cost using solely commodity components. Different implementations of message-passing libraries and system software (including operating systems) are installed on such cluster and multi-cluster computing systems. To guarantee the correct execution of a message-passing parallel application in a computing environment other than the one for which it was originally developed, a review of the application code is needed. In this paper, a hybrid communication interfacing strategy is proposed to execute a parallel application on a group of computing nodes belonging to different clusters or multi-clusters (computing systems that may be running different operating systems and MPI implementations), interconnected with public or private IP addresses, and responding interchangeably to user execution requests. Experimental results demonstrate the feasibility of this proposed strategy and its effectiveness through the execution of benchmark parallel applications.
Abstract:
Abstract Background This article aims to discuss the incorporation of traditional time in the construction of a management scenario for pink shrimp in the Patos Lagoon estuary (RS), Brazil. To meet this objective, two procedures were adopted: one at a conceptual level and another at a methodological level. At the conceptual level, the concept of traditional time as a form of traditional ecological knowledge (TEK) was adopted. Method At the methodological level, we conducted a wide literature review of the scientific knowledge (SK) that guides recommendations for pink shrimp management by restricting the fishing season in the Patos Lagoon estuary; in addition, we reviewed the ethno-scientific literature which describes traditional calendars as a management base for artisanal fishers in the Patos Lagoon estuary. Results Results demonstrate that TEK and SK describe similar estuarine biological processes, but are incommensurable at a resource management level. On the other hand, the construction of a "management scenario" for pink shrimp is possible through the development of "criteria for hierarchies of validity" which arise from a productive dialog between SK and TEK. Conclusions The commensurable and the incommensurable levels reveal different bases of time-space perception between traditional ecological knowledge and scientific knowledge. Despite incommensurability at the management level, it is possible to establish guidelines for the construction of "management scenarios" and to support a co-management process.
Abstract:
AIM: To analyze the search for Emergency Care (EC) in the Western Health District of Ribeirão Preto (São Paulo), in order to identify the reasons why users turn to these services in situations that are not characterized as urgencies and emergencies. METHODS: A qualitative and descriptive study was undertaken. A guiding script was applied to 23 EC users, addressing questions related to health service accessibility and welcoming, problem solving, reason to visit the EC and care comprehensiveness. RESULTS: The subjects reported that, at the Primary Health Care services, receiving care and scheduling consultations took a long time and that the opening hours of these services coincide with their work hours. At the EC service, access to technologies and medicines was easier. CONCLUSION: Primary health care services have been unable to turn into the entry door to the health system, being replaced by emergency services, putting a significant strain on these services' capacity.
Abstract:
In many countries, buildings are responsible for a substantial part of energy consumption, and this varies according to their energy and environmental performance. The potential for major reductions in building consumption has been well documented in Brazil. Opportunities have been identified throughout the life cycle of buildings, including those lost to projects replicated in diverse locations without the proper adjustments. This article offers a reflection on project processes and how they can be understood and conducted in an integrated way, favoring the use of natural resources and lowering energy consumption. It concludes by indicating that the longest phase in the life cycle of a building is also the phase responsible for its largest energy consumption, not only because of its duration but also because of the interaction with the end user. Therefore, in order to harvest the energy cost reduction potential of future buildings, designers need a holistic view of the surroundings, end users, materials and methodologies.
Abstract:
Máster Universitario en Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (SIANI)