708 results for Inception


Relevance:

10.00%

Publisher:

Abstract:

This thesis attempts a psychological investigation of hemispheric functioning in developmental dyslexia. Previous work using neuropsychological methods with developmental dyslexics is reviewed, and original work is presented, both of a conventional psychometric nature and also utilising a new means of intervention. At the inception of inquiry into dyslexia, comparisons were drawn between developmental dyslexia and acquired alexia, promoting a model of brain damage as the common cause. Subsequent investigators found developmental dyslexics to be neurologically intact, and so an alternative hypothesis was offered, namely that language is abnormally localized (not in the left hemisphere). Research in the last decade, using the advanced techniques of modern neuropsychology, has indicated that developmental dyslexics are probably left hemisphere dominant for language. The development of a new type of pharmaceutical preparation (that appears to have a left hemisphere effect) offers an opportunity to test the experimental hypothesis. This hypothesis propounds that most dyslexics are left hemisphere language dominant, but that some of these language-related operations are dysfunctioning. The methods utilised are those of psychological assessment of cognitive function, both in a traditional psychometric situation and with a new form of intervention (Piracetam). The information resulting from intervention is judged on its therapeutic validity and its contribution to the understanding of hemispheric functioning in dyslexics. The experimental studies using conventional psychometric evaluation revealed a dyslexic profile of poor sequencing and name coding ability, with adequate spatial and verbal reasoning skills. Neuropsychological information would tend to suggest that this profile is indicative of adequate right hemisphere abilities and deficits in some left hemisphere abilities.
When an intervention agent (Piracetam) was used with young adult dyslexics, there were improvements in both the rate of acquisition and the conservation of verbal learning. An experimental study with dyslexic children revealed that Piracetam appeared to improve reading, writing and sequencing, but did not influence spatial abilities. This would seem to accord with other recent findings that developmental dyslexics may have left hemisphere language localisation, although some of these language-related abilities are dysfunctioning.


This study explores the institutional logic(s) governing Corporate Internet Reporting (CIR) by Egyptian listed companies. In doing so, a mixed methods approach was followed. The qualitative part seeks to understand the perceptions, beliefs, values, and norms that are commonly shared by Egyptian companies engaged in these practices. Accordingly, seven cases of large listed Egyptian companies operating in different industries were examined, and other stakeholders and stockholders were interviewed in conjunction with these cases. The quantitative part consists of two studies. The first is descriptive, aiming to specify whether the logic(s) induced from the seven cases are commonly embraced by other Egyptian companies. The second is explanatory, aiming to investigate the impact of several institutional and economic factors on the extent of CIR, the types of online information, the quality of the websites, and the Internet facilities. Drawing on prior CIR literature, four potential types of logic could be inferred: efficiency-, legitimacy-, technical-, and marketing-based logics. In Egypt, legitimacy logic was initially embraced in the earlier years after the Internet's inception. Later, companies confronted radical challenges in their internal and external environments which impelled them to raise their websites' potential to defend their competitive position, either domestically or internationally. Thus, two new logics emphasizing marketing and technical perspectives emerged in response. Strikingly, efficiency-based logic is not the most prevalent logic driving CIR practices in Egypt, as it is in developed countries. The empirical results support this observation and show that almost half of Egyptian listed companies (115 as of December 2010) possessed an active website, and about half of these (62) disclosed part of their financial and accounting information during December 2010 to February 2011.
Less than half of the websites (52) offered the latest annual financial statements. Fewer websites, 33 (29%), provided shareholder and stock information, and 25 (22%) included a separate section for corporate governance, compared to 50 (44%) possessing a section for news or press releases. Additionally, variations in CIR practices, as well as in timeliness and credibility, were evident even at the industry level. After controlling for firm size, profitability, leverage, liquidity, competition, and growth, it was found that industrial companies and those facing little competition tend to disclose less. In contrast, management size, foreign investors, foreign listing, dispersion of shareholders, and firm size had a significant positive impact, individually or collectively, while neither audit firm nor most performance indicators (i.e., profitability, leverage, and liquidity) exerted an influence on CIR practices. Thus, it is suggested that CIR practices are loosely institutionalised in Egypt, which necessitates issuing several regulative and procedural rules to raise the quality attributes of Egyptian websites, especially timeliness and credibility. Besides, this study highlights the potency of assessing the impact of institutional logics on CIR practices and suggests paying equal attention to institutional and economic factors when comparing CIR practices over time or across different institutional environments in the future.
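As a quick arithmetic check, the percentages quoted above are consistent with the 115 active websites serving as the denominator (a minimal sketch; the category labels are paraphrased from the text):

```python
# Recomputing the disclosure shares quoted above, with the 115 active
# websites as the base. Labels are paraphrased from the abstract.
active_sites = 115
counts = {
    "shareholder and stock information": 33,  # reported as 29%
    "corporate governance section": 25,       # reported as 22%
    "news or press releases section": 50,     # reported as 44% (50/115 = 43.5%)
}
shares = {label: round(100 * n / active_sites) for label, n in counts.items()}
print(shares)
```

The first two shares reproduce the reported rounding exactly; 50/115 is 43.5%, which the original rounds up to 44%.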


All-optical signal processing is a powerful tool for the processing of communication signals, and optical network applications have been routinely considered since the inception of optical communication. There are many successful optical devices deployed in today’s communication networks, including optical amplifiers, dispersion compensators, optical cross-connects, and reconfigurable add-drop multiplexers. However, despite record-breaking performance, all-optical signal processing devices have struggled to find a viable market niche. This has been mainly due to competition from electro-optic alternatives, whether on the basis of detailed performance analysis or, more usually, because of the limited market opportunity for a mid-link device. For example, a wavelength converter would compete with a reconfigured transponder, which has an additional market as an actual transponder, enabling significantly more economical development. Nevertheless, the potential performance of all-optical devices is enticing. Motivated by their prospects of eventual deployment, in this chapter we analyse the performance and energy consumption of digital coherent transponders, linear coherent repeaters, and modulator-based pulse shaping/frequency conversion, setting a benchmark for the proposed all-optical implementations.


The Semantic Web has come a long way since its inception in 2001, especially in terms of technical development and research progress. However, adoption by non-technical practitioners is still an ongoing process, and in some areas this process is only now starting. Emergency response is an area where reliability and timeliness of information and technologies are of the essence. It is therefore quite natural that widespread adoption in this area has not been seen until now, when Semantic Web technologies are mature enough to support the high requirements of the application area. Nevertheless, to leverage the full potential of Semantic Web research results for this application area, there is a need for an arena where practitioners and researchers can meet to exchange ideas and results. Our intention is for this workshop, and hopefully coming workshops in the same series, to be such an arena for discussion. The Extended Semantic Web Conference (ESWC, formerly the European Semantic Web Conference) is one of the major research conferences in the Semantic Web field, and is thus a suitable venue for discussing the application of Semantic Web technology to our specific area. Hence, we chose to arrange our first SMILE workshop at ESWC 2013. However, this workshop does not focus solely on semantic technologies for emergency response, but rather on Semantic Web technologies in combination with technologies and principles for what is sometimes called the "social web". Social media has already been used successfully in many cases as a tool for supporting emergency response. The aim of this workshop is therefore to take this to the next level and answer questions like: "how can we make sense of, and furthermore make use of, all the data that is produced by different kinds of social media platforms in an emergency situation?"
For the first edition of this workshop the chairs collected the following main topics of interest:
• Semantic Annotation for understanding the content and context of social media streams.
• Integration of Social Media with Linked Data.
• Interactive Interfaces and visual analytics methodologies for managing multiple large-scale, dynamic, evolving datasets.
• Stream reasoning and event detection.
• Social Data Mining.
• Collaborative tools and services for Citizens, Organisations, Communities.
• Privacy, ethics, trustworthiness and legal issues in the Social Semantic Web.
• Use case analysis, with specific interest in use cases that involve the application of Social Media and Linked Data methodologies in real-life scenarios.
All of these, applied in the context of:
• Crisis and Disaster Management
• Emergency Response
• Security and Citizen Journalism
The workshop received 6 high-quality paper submissions, and based on a thorough review process, thanks to our program committee, the decision was made to accept four of these papers for the workshop (67% acceptance rate). These four papers can be found later in this proceedings volume. Three of the four papers particularly discuss the integration and analysis of social media data using Semantic Web technologies, e.g. for detecting complex events in social media streams, for visualizing and analysing sentiments with respect to certain topics in social media, or for detecting small-scale incidents entirely through the use of social media information. The fourth paper presents an architecture for using Semantic Web technologies in resource management during a disaster. Additionally, the workshop featured an invited keynote speech by Dr. Tomi Kauppinen from Aalto University. Dr. Kauppinen shared experiences from his work on applying Semantic Web technologies to application fields such as geoinformatics and scientific research, i.e. so-called Linked Science, as well as recent ideas and applications in the emergency response field. His input was also highly valuable for the roadmapping discussion, which was held at the end of the workshop. A separate summary of the roadmapping session can be found at the end of these proceedings. Finally, we would like to thank our invited speaker Dr. Tomi Kauppinen, all our program committee members, as well as the workshop chair of ESWC 2013, Johanna Völker (University of Mannheim), for helping us to make this first SMILE workshop a highly interesting and successful event!


The purpose of the study was to provide a historical record of the Bureau of Jewish Education/Central Agency for Jewish Education and its role in Jewish education in Miami since its inception in 1944, as well as to provide a sociological context within which to view the growth and development of the community. During the past 50 years of the Agency's existence, Dade County's Jewish population has undergone many changes, including a huge population increase in the 1960s and 1970s, a decrease in the 1980s and 1990s, and a shift from a postwar business class of store owners to a turn-of-the-century professional class. The methodology used in this study was threefold. First, document analysis of formal and informal documents dating from 1944 to the present was conducted. Second, personal interviews were conducted with the Executive Directors of the B.J.E./C.A.J.E., long-time B.J.E./C.A.J.E. staff, present staff, Greater Miami Jewish Federation leaders, and the lay leadership of C.A.J.E. Third, national trends in Jewish education were cited as a basis for comparison and contrast with the achievements of C.A.J.E. The historiography concluded that the Agency had come full circle in its programs. Analysis of the services provided to religious and day schools, early childhood education, the high schools, teacher services, adult education, and the library indicated that in some areas C.A.J.E. was an innovator, in other areas it followed national trends, and in others it was deficient. Recommendations included a re-educative process for the community with Jewish education made a top priority, more visibility and publicity for the work of C.A.J.E. to enhance its prestige and improve support, and holistic planning of programs for the future.


The Profiles that ran in the New Yorker magazine from its inception in 1925 are valuable examples of literary journalism and are of lasting value to scholars, both as texts and as cultural artifacts. This thesis assembles a bibliography of all Profiles that appeared under the magazine's first three editors, with some explanation of their importance to biographical and literary studies.


The Cuban military involvement in Angola has often been seen as a response to the wishes of the former Soviet Union. Yet Castro intervened in Angola following his theory of internationalism. Internationalism, as conceived by Castro, sent Cubans abroad on a voluntary basis to serve either in the military or in the civilian sector. This thesis illustrates that, from the mission's inception, Castro sent military troops to Angola to divert domestic concerns and to boost Cuba's alliances throughout the world. Angola differs from other internationalist missions because in Angola, for the first time, regular combat troops were used. Castro intervened in Angola to prevent a collapse of the Movimento Popular de Libertação de Angola (MPLA) government, and stayed on to ensure the viability of the MPLA. The primary sources are interviews, conducted by the author, with participants in the Angolan civil war. The secondary sources consulted are works on Cuba, Southern Africa, Portuguese colonialism, and Angola.


The Republic of South Africa has, since the inception of Apartheid policies in 1948, experienced economic problems resulting from spatially dispersed growth. The election of President Mandela in 1994, however, eliminated the last forms of Apartheid as well as its discriminatory spatial, social, and economic policies, especially toward black Africans. In Cape Town, South Africa, several initiatives have been instituted to restructure and economically revitalize blighted and abandoned township communities such as Langa. One element of this strategy is the development of activity streets. The main questions asked in this study are whether activity streets are a feasible solution to the local economic problems left by the apartheid system and whether they represent an economically sustainable approach to development. An analysis of a proposed activity street in Langa and its potential to generate jobs is undertaken. An Employment Generation Model used in this study shows that many of the businesses rely on the local purchasing power of the residents. Since the economic activities are mostly service oriented, a combination of manufacturing industries and institutionally implemented strategies within the township will have to be developed in order to generate sustainable employment. The results seem to indicate that, in Langa, the activity street depends heavily on increases in sales and in pedestrian and vehicular traffic flow.


In her discussion - The Tax Reform Act of 1986: Impact on Hospitality Industries - Elisa S. Moncarz, Associate Professor in the School of Hospitality Management at Florida International University, initially states: “After nearly two years of considering the overhaul of the federal tax system, Congress enacted the Tax Reform Act of 1986. The impact of this legislation is expected to affect virtually all individuals and businesses associated with the hospitality industry. This article discusses some of the major provisions of the tax bill, emphasizing those relating to the hospitality service industries and contrasting relevant provisions with prior law on their positive and negative effects to the industry. “On October 22, 1986, President Reagan signed the Tax Reform Act of 1986 (TRA 86) with changes so pervasive that a recodification of the income tax laws became necessary…,” Professor Moncarz says in providing a basic history of the bill. Two very important paragraphs underpin TRA 86 and this article; they should not be underestimated. The author wants you to know: “With the passage of TRA 86, the Reagan administration achieved the most important single domestic initiative of Reagan's second term, a complete restructuring of the federal tax system in an attempt to re-establish fairness in the tax code…,” an informed view, indeed. “These changes will result in an estimated shift of over $100 billion of the tax burden from individuals to corporations over the next five years [as of this article],” Professor Moncarz enlightens. “…TRA 86 embraces a conversion to the view that lowering tax rates and eliminating or restricting tax preferences (i.e., loopholes) “would be more economically and socially productive.” Hence, economic decisions would be based on economic efficiency as opposed to tax effect,” the author asserts.
“…both Congress and the administration recognized from its inception that the reform of the tax code must satisfy three basic goals,” and these goals are identified for you. Professor Moncarz outlines the positive impact TRA 86 will have on the U.S. economy in general, but also draws distinctions regarding the effects the Act will have on specific segments of the business community, with a particular eye toward the hospitality and food-service industries. Professor Moncarz also provides graphs to illustrate the comparative tax indexes of selected companies, encompassing the years 1983 through 1985. Deductibility and its importance are discussed as well. The author foresees limited partnerships, employment, and even new hotel construction and/or rehabilitation being affected by TRA 86. The article, as one would assume from this type of discussion, is liberally peppered with facts and figures.


Background: A subgroup has emerged within the obese that does not display the typical metabolic disorders associated with obesity and is hypothesized to have a lower risk of complications. The purpose of this review was to analyze the literature examining the burden of cardiovascular disease (CVD) and all-cause mortality in the metabolically healthy obese (MHO) population. Methods: PubMed, the Cochrane Library, and Web of Science were searched from their inception until December 2012. Studies were included which clearly defined the MHO group (using either insulin sensitivity and/or components of metabolic syndrome, AND obesity) and its association with all-cause mortality, CVD mortality, incident CVD, and/or subclinical CVD. Results: A total of 20 studies were identified: 15 cohort and 5 cross-sectional. Eight studies used the NCEP Adult Treatment Panel III definition of metabolic syndrome to define “metabolically healthy”, while another nine used insulin resistance. Seven studies assessed all-cause mortality, seven assessed CVD mortality, and nine assessed incident CVD. MHO was found to be significantly associated with all-cause mortality in two studies (30%), CVD mortality in one study (14%), and incident CVD in three studies (33%). Of the six studies which examined subclinical disease, four (67%) showed significantly higher mean common carotid artery intima-media thickness (CCA-IMT), coronary artery calcium (CAC), or other subclinical CVD markers in the MHO compared to their metabolically healthy normal weight (MHNW) counterparts. Conclusions: MHO is an important, emerging phenotype with a CVD risk between that of healthy, normal-weight and unhealthy, obese individuals. Successful work towards a universally accepted definition of MHO would improve (and simplify) future studies and aid inter-study comparisons.
The usefulness of a definition inclusive of insulin sensitivity and stricter criteria for metabolic syndrome components, as well as the potential addition of markers of fatty liver and inflammation, should be explored. Clinicians should be hesitant to reassure patients that the metabolically benign phenotype is safe, as increased risk of cardiovascular disease and death has been shown.


Software engineering researchers are challenged to provide increasingly more powerful levels of abstraction to address the rising complexity inherent in software solutions. One development paradigm that places models as abstractions at the forefront of the development process is Model-Driven Software Development (MDSD). MDSD considers models as first-class artifacts, extending the capability for engineers to use concepts from the problem domain of discourse to specify apropos solutions. A key component of MDSD is domain-specific modeling languages (DSMLs), which are languages with focused expressiveness targeting a specific taxonomy of problems. The de facto approach is to first transform DSML models to an intermediate artifact in a high-level language (HLL), e.g., Java or C++, and then execute the resulting code. Our research group has developed a class of DSMLs, referred to as interpreted DSMLs (i-DSMLs), where models are directly interpreted by a specialized execution engine with semantics based on model changes at runtime. This execution engine uses a layered architecture and is referred to as a domain-specific virtual machine (DSVM). As the domain-specific model being executed descends the layers of the DSVM, the semantic gap between the user-defined model and the services provided by the underlying infrastructure is closed. The focus of this research is the synthesis engine, the layer in the DSVM which transforms i-DSML models into executable scripts for the next lower layer to process. The appeal of an i-DSML is constrained because it possesses unique semantics contained within the DSVM. Existing DSVMs for i-DSMLs exhibit tight coupling between the implicit model of execution and the semantics of the domain, making it difficult to develop DSVMs for new i-DSMLs without a significant investment in resources. At the onset of this research, only one i-DSML had been created, for the user-centric communication domain, using the aforementioned approach.
This i-DSML is the Communication Modeling Language (CML), and its DSVM is the Communication Virtual Machine (CVM). A major problem with the CVM's synthesis engine is that the domain-specific knowledge (DSK) and the model of execution (MoE) are tightly interwoven; consequently, subsequent DSVMs would need to be developed from inception with no reuse of expertise. This dissertation investigates how to decouple the DSK from the MoE, subsequently producing a generic model of execution (GMoE) from the remaining application logic. This GMoE can be reused to instantiate synthesis engines for DSVMs in other domains. The generalized approach to developing the model synthesis component of i-DSML interpreters utilizes a reusable framework loosely coupled to the DSK as swappable framework extensions. This approach involves first creating an i-DSML and its DSVM for a second domain, demand-side smart grid (microgrid) energy management, and designing the synthesis engine so that the DSK and MoE are easily decoupled. To validate the utility of the approach, the synthesis engines are instantiated using the GMoE and the DSKs of the two aforementioned domains, and an empirical study is performed to support our claim of reduced development effort.
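The decoupling described above can be pictured as a plug-in arrangement: a generic engine that only detects model changes and delegates, with the domain-specific knowledge behind a stable interface. The following Python sketch is purely illustrative; the class names, the "steps" produced, and the set-based model representation are invented for this example and are not the actual CVM/DSVM APIs.

```python
# Illustrative sketch: a generic model of execution (GMoE) loosely coupled
# to swappable domain-specific knowledge (DSK) extensions.

class DomainKnowledge:
    """Swappable DSK extension: maps an abstract model change to concrete steps."""
    def steps_for(self, change):
        raise NotImplementedError

class CommunicationDSK(DomainKnowledge):
    # Hypothetical knowledge for the user-centric communication domain.
    def steps_for(self, change):
        return [f"negotiate {change}", f"open media stream for {change}"]

class MicrogridDSK(DomainKnowledge):
    # Hypothetical knowledge for the demand-side smart grid domain.
    def steps_for(self, change):
        return [f"read meter for {change}", f"dispatch load {change}"]

class SynthesisEngine:
    """Generic MoE: detects model changes at runtime and delegates to the DSK."""
    def __init__(self, dsk: DomainKnowledge):
        self.dsk = dsk

    def synthesize(self, old_model: set, new_model: set):
        # Here a model "change" is simply an element present only in the new model.
        script = []
        for change in sorted(new_model - old_model):
            script.extend(self.dsk.steps_for(change))
        return script

# The same engine is instantiated with different DSKs for different domains.
engine = SynthesisEngine(CommunicationDSK())
print(engine.synthesize({"audio"}, {"audio", "video"}))
```

Swapping `CommunicationDSK` for `MicrogridDSK` reuses the engine unchanged, which is the reuse claim the dissertation's empirical study evaluates.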


Archival research was conducted on the inception of preemployment psychological testing, as part of the background screening process, to select police officers for a local police department. Various issues and incidents were analyzed to help explain why this police department progressed from an abbreviated version of a psychological battery to a much more sophisticated and comprehensive set of instruments. While doubts about psychological exams do exist, research has shown that many are valid and reliable in predicting the job performance of police candidates. During a three-year period, the police department hired 162 candidates (133 males and 29 females) who received "acceptable" psychological ratings and 71 candidates (58 males and 13 females) who received "marginal" psychological ratings. A document analysis covered variables that have been identified as job performance indicators which police psychological testing tries to predict in order to "screen in" or "screen out" appropriate applicants. The areas of focus comprised the 6-month police academy, the 4-month Field Training Officer (FTO) Program, the remaining probationary period, and yearly performance up to five years of employment. Specific job performance variables were the final academy grade average, supervisors' evaluation ratings, reprimands, commendations, awards, citizen complaints, time losses, sick time usage, reassignments, promotions, and separations. A causal-comparative research design was used to determine whether there were statistically significant differences in these job performance variables between police officers with "acceptable" psychological ratings and police officers with "marginal" psychological ratings. The results of multivariate analyses of variance, t-tests, and chi-square procedures, as applicable, showed no significant differences between the two groups on any of the job performance variables.
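The kind of two-group comparison described above can be illustrated with a small sketch. The grade values below are invented for illustration (the study's actual scores are not reproduced in the abstract); the statistic is the Welch two-sample t, appropriate when group variances may differ:

```python
# Illustrative two-group comparison on a continuous job performance variable
# (e.g., final academy grade average). Data are hypothetical.
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

acceptable = [88, 91, 85, 90, 87]   # hypothetical academy grades, "acceptable" group
marginal = [86, 89, 84, 90, 88]     # hypothetical academy grades, "marginal" group
t = welch_t(acceptable, marginal)
print(round(t, 3))
```

A t statistic this close to zero would, as in the study's findings, indicate no significant difference between the two groups on this variable.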


The Last Interglacial (LIG, 129-116 thousand years before present, ka) represents a test bed for climate model feedbacks in warmer-than-present high-latitude regions. However, mainly because aligning different palaeoclimatic archives from different parts of the world is not trivial, a spatio-temporal picture of LIG temperature changes is difficult to obtain. Here, we have selected 47 polar ice core and sub-polar marine sediment records and developed a strategy to align them onto the recent AICC2012 ice core chronology. We provide the first compilation of high-latitude temperature changes across the LIG associated with a coherent temporal framework built between ice core and marine sediment records. Our new data synthesis highlights non-synchronous maximum temperature changes between the two hemispheres, with the Southern Ocean and Antarctic records showing an early warming compared to North Atlantic records. We also observe that warmer-than-present-day conditions occur for a longer period in southern high latitudes than in northern high latitudes. Finally, the amplitude of temperature change at high northern latitudes is larger than that recorded at high southern latitudes at the onset and the demise of the LIG. We have also compiled four data-based time slices with temperature anomalies (relative to present-day conditions) at 115 ka, 120 ka, 125 ka and 130 ka and quantitatively estimated temperature uncertainties that include relative dating errors. This provides an improved benchmark for more robust model-data comparison. The surface temperature simulated by two General Circulation Models (CCSM3 and HadCM3) for 130 ka and 125 ka is compared to the corresponding time slice data synthesis. This comparison shows that the models predict warmer-than-present conditions earlier than documented in the North Atlantic, while neither model is able to produce the reconstructed early Southern Ocean and Antarctic warming.
Our results highlight the importance of producing a sequence of time slices rather than one single time slice averaging the LIG climate conditions.
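The time-slice construction described above amounts to evaluating each record's temperature anomaly at a common target age on the shared chronology. A minimal sketch, with the record values invented for illustration:

```python
# Building one entry of a data-based "time slice": linearly interpolate a
# record's temperature anomaly (vs. present day) onto a target age on a
# common chronology. The record values below are invented for illustration.

def interp(ages, values, target):
    """Linearly interpolate a series onto a target age (ka); ages ascending."""
    for i in range(len(ages) - 1):
        a0, a1 = ages[i], ages[i + 1]
        if a0 <= target <= a1:
            w = (target - a0) / (a1 - a0)
            return values[i] + w * (values[i + 1] - values[i])
    raise ValueError("target age outside record")

record_ages = [114, 118, 122, 126, 130]   # ka on the common chronology
record_anom = [0.5, 2.0, 3.5, 2.5, -1.0]  # degC vs present day (invented)
slice_125 = interp(record_ages, record_anom, 125)  # anomaly for the 125 ka slice
print(slice_125)
```

Repeating this across all 47 records at 115, 120, 125 and 130 ka yields the four time slices; the relative dating errors mentioned above then translate into an additional uncertainty on each interpolated value.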


Since the inception of the international GEOTRACES program, studies investigating the distribution of trace elements and their isotopes in the global ocean have increased significantly. In spite of this large-scale effort, the distribution of neodymium isotopes (143Nd/144Nd) and concentrations ([Nd]) in the high-latitude south Pacific is still understudied. Here we report dissolved Nd isotopes and concentrations from 11 vertical water column profiles from the south Pacific between South America and New Zealand. Results suggest that Ross Sea Bottom Water (RSBW) is represented by an epsilon-Nd value of ~ -7 and is thus more radiogenic than Circumpolar Deep Water (epsilon-Nd ~ -8). RSBW and its characteristic epsilon-Nd signature can be traced far into the SE Pacific until progressive mixing with ambient Lower Circumpolar Deep Water (LCDW) dilutes this signal north of the Antarctic Polar Front (APF). The SW-NE trending Pacific-Antarctic Ridge restricts the advection of RSBW into the SW Pacific, where bottom water density, salinity, and epsilon-Nd values of -9 indicate the presence of bottom waters of an origin different from the Ross Sea. Neodymium concentrations show low surface values and a linear increase with depth north of the Polar Front. South of the APF, surface [Nd] is high and increases with depth but remains almost constant below ~1000 m. This vertical and spatial [Nd] pattern follows the southward-shoaling density surfaces of the Southern Ocean frontal system and hence suggests supply of Nd to the upper ocean through upwelling of Nd-rich deep water. Low particle abundance, dominated by reduced opal production and seasonal sea ice cover, likely contributes to the maintenance of the high upper-ocean [Nd] south of the APF. The reported data highlight the use of Nd isotopes as a water mass tracer in the Southern Ocean, with potential for paleoceanographic reconstructions, and contribute to an improved understanding of Nd biogeochemistry.
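The epsilon-Nd notation used throughout this abstract expresses a measured 143Nd/144Nd ratio as the parts-per-10^4 deviation from the CHUR (Chondritic Uniform Reservoir) reference value of 0.512638:

```python
# epsilon-Nd = (measured 143Nd/144Nd divided by CHUR, minus 1) * 10^4
CHUR = 0.512638  # 143Nd/144Nd of the Chondritic Uniform Reservoir

def epsilon_nd(ratio):
    return (ratio / CHUR - 1) * 1e4

# A measured ratio of ~0.512279 corresponds to epsilon-Nd ~ -7, the value
# reported above for Ross Sea Bottom Water.
print(round(epsilon_nd(0.512279), 1))
```

More radiogenic water masses (e.g., RSBW at ~ -7) thus have higher 143Nd/144Nd ratios than less radiogenic ones (e.g., the ~ -9 bottom waters of the SW Pacific).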


This study aimed to build a virtual learning environment for application of the nursing process based on the NANDA-I, NOC, NIC, and ICNP® classifications. Given persistent problems in the learning of the nursing process and its classifications, there is an urgent need to develop innovative teaching resources that modify the relationship between students and teachers. The methodology followed the inception, elaboration, construction, and transition phases of the Rational Unified Process software development methodology. The development team was composed of researchers and students from the Care and Epidemiological Practice in Health and Nursing group and the Software Engineering course of the Federal University of Rio Grande do Norte, with the participation of the Lisbon and Porto Schools of Nursing in Portugal. In the inception phase, communication among the researchers defined the functions, features, and tools for the construction process. In the elaboration phase, planning and modeling produced a diagram and architectural drawings specifying the features and functionality of the software. In the construction phase, the modules and areas (administrator, teacher, student, and construction of the nursing process) were developed, with unit and integration testing of their interfaces. The transition phase then delivered a complete, functioning system, together with training and hands-on use by the researchers. In conclusion, this study enabled the planning and construction of an educational technology, and it is expected that its implementation will trigger a substantial change in the learning of the nursing process and classifications, with the student as an active agent of the learning process. A subsequent assessment of functional performance will support further development of the software through feedback, correction of defects, and necessary changes.
It is believed that, with incremental improvement after these reviews, the tool will mature further and help introduce this methodology and the standardized languages into educational and health institutions, promoting the paradigm change desired by nursing.