935 results for "Methodological proposal"
Abstract:
Nowadays, software testing and quality assurance are highly valued in the software development process. Software testing is not a single concrete discipline; it is a process of verification and validation that begins with the idea for a future product and ends only when the product's maintenance ends. Industry places strong emphasis on testing methods and tools that can be applied at the different testing phases. The initial objectives of this thesis were to provide a sufficient literature review of the different testing phases and, for each phase, to define a method that can be used effectively to improve software quality. The software testing phases chosen for study are: unit testing, integration testing, functional testing, system testing, acceptance testing and usability testing. The research showed that many software testing methods can be applied at the different phases and that, in most cases, the choice of method should depend on the software's type and specification. For each phase, the thesis identifies an associated problem and suggests and describes in detail a method that can help eliminate it.
Abstract:
Industrial applications demand that robots operate according to the position and orientation of their end effector, which requires solving the inverse kinematics problem: determining the joint displacements of the manipulator needed to accomplish a given objective. Complete studies of the dynamic control of robotic joints are also necessary. Initially, this article focuses on the implementation of numerical algorithms for the solution of the inverse kinematics problem and on the modeling and simulation of dynamic systems, using real-time implementation. The modeling and simulation of dynamic systems are performed with an emphasis on off-line programming. Next, a complete study of the control strategies is carried out through the study of the elements of a robotic joint, such as the DC motor, inertia, and gearbox. Finally, a trajectory generator, used as input for a generic group of joints, is developed, and a proposal for the implementation of a joint controller using an EPLD development system is presented.
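The inverse kinematics problem described above can be illustrated with a minimal closed-form sketch for a hypothetical two-link planar arm; the article itself treats general manipulators with numerical algorithms, and the link lengths and target used here are illustrative assumptions, not values from the study:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Joint angles (elbow-down branch) placing a 2-link planar arm's
    end effector at (x, y); raises ValueError if the target is unreachable."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    s2 = math.sqrt(1.0 - c2 * c2)          # elbow-down solution
    theta2 = math.atan2(s2, c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * s2, l1 + l2 * c2)
    return theta1, theta2

def two_link_fk(theta1, theta2, l1, l2):
    """Forward kinematics, used here only to verify the inverse solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Running the forward kinematics on the returned angles recovers the requested target, which is the usual sanity check for any inverse kinematics routine, numerical or closed-form.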
Abstract:
A new concept termed "radioautographology" is advocated. This term was synthesized from "radioautography" and "ology", expressing a new science derived from radioautography. The concept of radioautographology (RAGology) is that of a science whose objective is to localize radioactive substances in the biological structure of objects and to analyze and study the significance of these substances in the biological structure. On the other hand, the old term radioautography (RAG) is the technique used to demonstrate the pattern of localization of various radiolabeled compounds in specimens. The specimens used in biology and medicine are cells and tissues. They are fixed, sectioned and placed in contact with the radioautographic emulsions, which are exposed and developed to produce metallic silver grains. Such specimens are designated as radioautographs and the patterns of pictures made of silver grains are named radioautograms. The technicians who produce radioautographs are named radioautographers, while those who study RAGology are scientists and should be called radioautographologists. The science of RAGology can be divided into two parts, general RAGology and special RAGology, as most natural sciences usually can. General RAGology is the technology of RAG which consists of three fields of science, i.e., physics concerning radioactivity, histochemistry for the treatment of cells and tissues, and photochemistry dealing with the photographic emulsions. Special RAGology, on the other hand, consists of applications of general RAGology. The applications can be classified into several scientific fields, i.e., cellular and molecular biology, anatomy, histology, embryology, pathology and pharmacology. Studies carried out in our laboratory are summarized and reviewed. All the results obtained from such applications should be systematized as a new field of science in the future.
Abstract:
In the present study we standardized an experimental model of parabiotic circulation in the isolated pig heart. The isolated heart was perfused with arterial blood from a second, support animal and submitted to regional ischemia for 30 min, followed by total ischemia for 90 min and reperfusion for 90 min. Parameters for the measurement of ventricular performance, using indices measured directly or indirectly from intraventricular pressure, were defined as: maximum peak pressure, end-diastolic pressure, developed pressure, maximum first derivative of pressure (dP/dt max), minimum first derivative of pressure (dP/dt min), systolic stress of the left ventricle (sigma s), and maximum elastance of the left ventricle. Isolated hearts subjected to regional and global ischemia presented significant worsening of all measured parameters; the least discriminative parameters were dP/dt max and dP/dt min. Elastance was the most sensitive parameter during the reperfusion period, demonstrating an early loss of ventricular function during reperfusion. The model proved to be stable and reproducible and permitted the study of several variables in the isolated heart, such as ischemia and reperfusion phenomena, the effects of different drugs, and surgical interventions. The model has an advantage over classical models that use crystalloid solutions as perfusate, because parabiotic circulation mimics heart surgery with extracorporeal circulation.
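Several of the pressure-derived indices listed above can be computed directly from a sampled intraventricular pressure trace. A minimal sketch, assuming a synthetic waveform in place of real transducer data (sampling rate and pressure values are invented for illustration):

```python
import numpy as np

fs = 1000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)             # one cardiac cycle, s
p = 10.0 + 90.0 * np.sin(np.pi * t) ** 2      # synthetic LV pressure, mmHg

peak_pressure = p.max()                       # maximum peak pressure
end_diastolic = p.min()                       # end-diastolic pressure
developed = peak_pressure - end_diastolic     # developed pressure
dpdt = np.gradient(p, t)                      # first derivative of pressure
dpdt_max, dpdt_min = dpdt.max(), dpdt.min()   # dP/dt max and dP/dt min
```

With real data, the same derivative-based indices are simply read off the digitized pressure signal beat by beat; elastance additionally requires simultaneous volume information, which this sketch does not model.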
Abstract:
We present a critical analysis of the generalized use of the "impact factor". By means of the Kruskal-Wallis test, it was shown that it is not possible to compare distinct disciplines using the impact factor without adjustments. After assigning the median journal the value of one (1.000), the impact factor value for each journal was calculated by the rule of three. The adjusted values were homogeneous, thus permitting comparison among distinct disciplines.
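The adjustment described above, setting the median journal of each discipline to 1.000 and rescaling the rest proportionally (the "rule of three"), amounts to dividing every impact factor by the discipline's median. A minimal sketch with invented journal names and values:

```python
from statistics import median

# Hypothetical raw impact factors for journals in one discipline.
raw = {"J1": 0.8, "J2": 1.6, "J3": 3.2}

m = median(raw.values())
adjusted = {j: round(v / m, 3) for j, v in raw.items()}
# The median journal now scores exactly 1.000 and ratios are preserved,
# so adjusted values can be compared across disciplines.
```

Applying the same rescaling within each discipline puts all journals on a common, median-anchored scale, which is what makes the cross-discipline comparison in the abstract possible.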
Abstract:
Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique that provides information on the functional state of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of disease for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years to obtain images of small targets with good spatial resolution and sensitivity. Multipinhole, coded-mask and slit-based collimators are presented as alternative approaches to improving image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of radiotracers in the target. Simultaneously, they can be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images are presented to illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction for near-field imaging. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals.
Abstract:
Sarcopenic obesity is the combination of reduced fat-free mass (FFM) and increased fat mass (FM) with advancing age, but there is a lack of clear criteria for its identification. The purposes of the present investigation were: 1) to determine the prevalence of postmenopausal women with reduced FFM relative to their FM and height, and 2) to examine whether there are associations between the proposed classification and health-related variables. A total of 607 women were included in this cross-sectional study and were separated into two subsets: 258 older women with a mean age of 66.8 ± 5.6 years and 349 young women aged 18-40 years (mean age, 29.0 ± 7.5 years). All volunteers underwent body composition assessment by dual-energy X-ray absorptiometry. An FFM index relative to FM and height was calculated, and the cutoff value corresponded to two standard deviations below the mean of the young reference group. To examine the clinical significance of the classification, all older participants underwent measurements of quadriceps strength and cardiorespiratory fitness. Values were compared between those classified as low FFM and those who were not, using an independent-samples t-test, and correlations were examined. The cutoff corresponded to a residual of -3.4 and yielded a sarcopenic obesity prevalence of 19.8%, which was associated with reduced muscle strength and aerobic fitness among the older participants. The index also correlated significantly with the health-related fitness variables. The results demonstrated reduced functional capacity for those below the proposed cutoff and suggest the applicability of the approach as a definition for sarcopenic obesity.
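The classification rule above, an FFM index adjusted for FM and height with a cutoff two standard deviations below the young-reference mean, can be sketched as a regression-residual approach. All data below are simulated and the coefficients are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 349                                          # simulated young reference sample
height = rng.normal(1.63, 0.06, n)               # m
fm = rng.normal(20.0, 6.0, n)                    # fat mass, kg
# Assumed generating model for fat-free mass (kg), with noise:
ffm = 14.0 * height ** 2 + 0.25 * fm + rng.normal(0.0, 2.0, n)

# Regress FFM on FM and height within the reference group.
X = np.column_stack([np.ones(n), fm, height])
beta, *_ = np.linalg.lstsq(X, ffm, rcond=None)
resid = ffm - X @ beta

# Cutoff: 2 SD below the reference mean residual (~0 by construction).
cutoff = resid.mean() - 2.0 * resid.std()

def low_ffm(fm_i, height_i, ffm_i):
    """Classify an older participant as low FFM relative to FM and height."""
    predicted = beta @ np.array([1.0, fm_i, height_i])
    return (ffm_i - predicted) < cutoff
```

An older participant whose measured FFM falls more than two reference standard deviations below the value predicted from her FM and height is flagged, which mirrors the residual-based cutoff reported in the abstract.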
Abstract:
Mixed flavor beverages represent a trend that is gaining the allegiance of potential fruit juice consumers. The present study proposed to prepare mixed flavor beverages and verify their consumer acceptance. Cajá beverage (sample A) was used as the standard. The other beverages were prepared by mixing the cajá-flavored product with other flavors: strawberry (B), pineapple (C), jabuticaba (D), mango (E) and cashew (F). The consumer profiles in the two regions studied were similar. Overall beverages B, A and F were the most accepted, with scores of 7.7, 6.4 and 6.2, respectively. Internal Preference Mapping showed that most of the consumers were located near beverages A, B and F, confirming the acceptance results. The consumers indicated appearance and flavor as the most appreciated characteristics in beverages A, B and F. Beverages A, B and F presented higher total soluble solids contents and viscosities than the other beverages. Consumer segmentation did not depend on the different levels of familiarity with the cajá flavor. Thus the preparation of mixed flavor beverages of cajá-strawberry and cajá-cashew is an excellent proposal because it presents flavors with good potential for marketing in different regions of Brazil.
Abstract:
One of the merits of contemporary economic analysis is its capacity to offer accounts of choice behavior that dispense with details of the complex decision machinery. The starting point of this paper is the concern with the important methodological debate about whether economics might offer accurate predictions and explanations of actual behavior without any reference to psychological presuppositions. Inspired by an exercise of rational reconstruction of ideas, I aim to offer an interpretation of the process of freeing economic analysis from psychology at the end of the 19th century and the contemporary resurrection of behavioral approaches in the late 1980s.
Abstract:
The article analyses the current process of economic integration in South America. Concentrating on the UNASUR regional integration process, two questions arise. First, is UNASUR the most viable institution to achieve a consistent process of economic integration in South America? Second, what model of economic integration should be adopted in the case of UNASUR to ensure macroeconomic stability and avoid financial and exchange rate crises in South America? To answer these questions, the article proposes a regional arrangement for UNASUR based on Keynes's (1944/1980) revolutionary analysis presented in his International Clearing Union proposal at the Bretton Woods Conference in 1944.
Abstract:
This paper examines the structuralist tradition in economics, emphasizing the role that structures play in the economic growth of developing countries. Since the subject at hand is evidently too large to cover in a single article, an emphasis has been brought to bear upon the macroeconomic elements of such a tradition, while also exploring its methodological aspects. It begins by analysing some general aspects of structuralism in economics (its evolution and origins) associated with ECLAC thought, in this instance focusing on the dynamics of the center-periphery relationship. Thereafter, the macroeconomic structuralism derived from the works of Taylor (1983, 1991) is presented, followed by a presentation of neo-structuralism. Centred on the concept of systemic competitiveness, this approach defines a strategy to achieve the high road of globalization, understood here as an inevitable process in spite of its engagement being dependent on the policies adopted. The conclusions show the genuine contributions of this tradition to economic theory.
Abstract:
Numerous definitions of forgiveness have been proposed in the literature (e.g., North, 1987; Enright, Freedman & Rique, 1998), most of which are based on religious or philosophical notions rather than on empirical evidence. Definitions employed by researchers have typically set very high standards for forgiveness. This research was designed to investigate the possibility that these definitions describe an ideal of forgiveness and may not reflect laypersons' beliefs and experiences. Using Higgins' Self-Discrepancy Theory as a framework, three types of forgiveness beliefs were investigated: actual, ideal, and ought. Q-methodology (which permits intensive study of phenomena in small samples) was employed to examine and compare participants' beliefs about forgiveness across these domains. Thirty participants (20 women), 25 to 78 years of age, were recruited from the community. They were asked to sort a set of 66 statements about forgiveness according to their level of agreement with each statement. This process was repeated three times, with the goal of modelling participants' actual experiences, their ideals, and how they believed forgiveness ought to be. Three perspectives on forgiveness emerged across the domains: forgiveness as motivated by religious beliefs, reconciliation-focussed forgiveness, and conflicted forgiveness. These perspectives indicated that, for many participants, the definitions presented in the literature may coincide with their beliefs about how forgiveness would ideally be and should be, as well as with their experiences of forgiveness; however, a large number of participants' experiences of, and beliefs about, forgiveness do not conform to the standards set out in the literature, and to exclude these participants' experiences and beliefs would mean overlooking what forgiveness means to a large portion of people. Results of this study indicate that researchers need to keep an open mind about what forgiveness may mean to their participants.
Abstract:
Q-methodology permitted 41 people to communicate their perspective of grief. In an attempt to clarify the research to date and to allow those who have experienced this human journey to direct the scientists, 80 statements drawn from academic and counselling sources were presented to the participants. Five different perspectives emerged from the Q-sorts and factor analysis, each valuable for understanding a different group of mourners. They were interpreted using questionnaire data and interview information. They are as follows: Factor 1 - Growth Optimism; Factor 2 - Schema Destruction and Negative Affect; Factor 3 - Identification with the Deceased Person; Factor 4 - Intact Worldview with High Clarity and High Social Support; Factor 5 - Schema Destruction with High Preoccupation and Attention to Emotion. Some people grow in the face of grief, others hold on to essentially the same schemas, and others are devastated by their loss. The different perspectives reported herein supply clues to the sources of these differing outcomes. From examination of Factor 1, it appears that a healthy living relationship helps substantially in the event of loss. An orientation toward emotions that encourages clarity, exemplified by Factor 4, without hyper-vigilance to emotion, may be helpful as well. Strategies for maintaining schematic representations of the world with little alteration include identification with the values of the deceased person, as in Factor 3, and reliance on social support and/or God, as demonstrated by Factor 4. When the relationship had painful periods, social support may be accessed to benefit some mourners. When the person's frame of reference or higher-order schemas are assaulted by the events of loss, the people most at risk for traumatic grief seem to be those with difficult relationships, as indicated by Factor 5 individuals.
When low social support, high attention to emotion with low clarity, and little belief that feelings can be altered for the better are also attributes of the mourner, devastating grief can result. In the end, there are groups of people who are forced to endure the entire process of schema destruction and devastation. Some appear to recover in part and others appear to stay in a form of purgatory for many years. The results of this study suggest that those who experience devastating grief may be in the minority. In the future, interventions could be more specifically targeted if these perspectives are replicated in a larger, more detailed study.
Abstract:
This thesis explored early literacy development in young vulnerable readers. More specifically, it examined an emergent literacy program called Reading Rocks Junior, offered by the Learning Disabilities Association of Niagara Region to children four to six years of age living in low socioeconomic status communities. Three methodologies were combined to create a rich and complete picture of an effective and accessible literacy program. First, a description of the Reading Rocks Junior program is outlined. Second, quantitative data collected pre- and post-program were analyzed to demonstrate achievement gains made as a result of participating in the program. Finally, qualitative interviews with the program coordinator, the convener of the agency that funded Reading Rocks Junior, and three parents whose children participated in the program were analyzed to determine the contextual factors that make Reading Rocks Junior a success.