958 results for "Operadores de flexibilidade"


Relevance:

10.00%

Publisher:

Abstract:

Hiker Dice is a game recently proposed in software designed by Mara Kuzmich and Leonardo Goldbarg. In the game, a die builds a trail on an n x m board: as the die rests on a cell, it prints on that cell the value of the face touching the surface. The game gives rise to the Simple Maximum Hiker Dice Hamiltonian Path Problem (Hidi-CHS) on compact boards, characterized by the search for a Hamiltonian path that maximizes the sum of the faces printed on the board. The research reported here models the problem with graphs and proposes two classes of solution algorithms. The first class, of exact algorithms, consists of a backtracking algorithm whose returns are driven by logical rules and by bounding against the best solution found. The second class comprises metaheuristics of the Evolutionary Computing, Randomized Local Search and GRASP (Greedy Randomized Adaptive Search Procedure) types. Three operators specific to these algorithms were created: restructuring, recombination of two solutions, and randomized greedy construction. The exact algorithm was tested on 4x4 to 8x8 boards; larger cases were ruled out by the explosion in processing time. The heuristic algorithms were tested on 5x5 to 14x14 boards. According to the evaluation methodology applied, the results achieved by the heuristic algorithms suggest a better performance for the GRASP algorithm.
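The GRASP metaheuristic singled out above follows a well-known template. The sketch below shows only that generic loop, assuming a greedy-randomized construction over a restricted candidate list (RCL); the callbacks, the alpha parameter and the move encoding are illustrative assumptions, not the operators developed in the thesis.

    import random

    def grasp(candidate_moves, evaluate, local_search, iterations=100, alpha=0.3, seed=0):
        """Generic GRASP loop: greedy-randomized construction followed by local search.

        candidate_moves(partial) -> list of (move, greedy_value) pairs extending a partial solution
        evaluate(solution)       -> objective value to maximize (e.g., sum of printed faces)
        local_search(solution)   -> locally improved solution
        """
        rng = random.Random(seed)
        best, best_value = None, float("-inf")
        for _ in range(iterations):
            # Construction phase: repeatedly pick a move from the restricted candidate list.
            solution = []
            moves = candidate_moves(solution)
            while moves:
                values = [v for _, v in moves]
                lo, hi = min(values), max(values)
                threshold = hi - alpha * (hi - lo)   # RCL cutoff between pure greedy and pure random
                rcl = [m for m, v in moves if v >= threshold]
                solution.append(rng.choice(rcl))
                moves = candidate_moves(solution)
            # Improvement phase: local search around the constructed solution.
            solution = local_search(solution)
            value = evaluate(solution)
            if value > best_value:
                best, best_value = solution, value
        return best, best_value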

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

10.00%

Publisher:

Abstract:

Recognizing the health risks to which obese adolescents are exposed and the need for studies relating excess body weight to physical capacity during puberty, the aim of this report is to explore the relationships between overweight and the physical fitness of elementary and high school students at a private school in the state of São Paulo. This is a cohort study, taking body mass index (BMI) as the independent variable and abdominal localized muscular endurance (LME), lower limb strength, flexibility and agility as the dependent variables. The data of interest were collected on two occasions (at the beginning of two consecutive school years) using specific tests. Multivariate analysis of the mean profiles was performed, complemented by the construction of simultaneous confidence intervals at the 5% significance level. A 21.42% prevalence of overweight was found, associated with: i) in boys, lower abdominal LME and lower limb strength and no improvement in mean agility; ii) in girls, lower abdominal LME and agility at the beginning of the period considered. Comparing the results between the two assessment moments, it was noted that: i) the agility of girls with adequate body weight decreased significantly; ii) abdominal LME increased even in the overweight male group; and iii) agility improved only in boys with lower BMI. Boys also showed significantly higher means of abdominal LME, lower limb strength and agility than the female group. Accordingly, the prescription of physical activity for adolescents should indeed be specific to body composition and sex.

Relevance:

10.00%

Publisher:

Abstract:

This study aims at understanding how sociocultural adjustment occurs for managers, and their spouses, expatriated to Brazil by private and public Spanish organizations. To do so, it adopts as its main theoretical frame the expatriate adaptation model proposed by Parker & McEvoy (1993), based on Black, Mendenhall & Oddou's (1991) model, which establishes three dimensions of adaptation: adjustment to work, adjustment to the general environment, and adjustment to interaction with host-country nationals. This exploratory and descriptive work used field research to gather primary data that were subsequently analyzed with a qualitative approach. Data collection consisted of individual in-depth interviews with three Spanish managers expatriated to Brazil and two of their spouses. The resulting data were analyzed through thematic analysis, one of the content analysis procedures. The research shows that adjustment is obstructed by cultural distance or cultural novelty rather than by work role characteristics, and is more successful for expatriates who bring solid prior sociocultural knowledge about the host country. It also verifies that the degree of expatriate adjustment is enhanced by comprehension of the cultural differences that give rise to values and behaviors different from the expatriate's own. It points out that individual factors such as perception and relational skills, flexibility, empathy and self-efficacy are positively linked to the three dimensions of adjustment: work, general adjustment and interaction adjustment. It finds expatriate adjustment to be lowered by the spouse's unsuccessful adjustment, and shows that location in an environment perceived as lacking key infrastructure is negatively linked to adjustment for expatriates coming from strongly urban environments. It concludes that expatriate adjustment occurs through progressive understanding of the host-country environment and comprehension of the sociocultural context that explains differences between host-country behaviors and values and those of the country of origin, a process favored by individual characteristics not directly linked to the expatriate's technical qualification, such as perception and relational skills, flexibility and empathy, together with solid sociocultural knowledge about the host country. The research therefore proposes that organizations involved in expatriation processes include in their selection criteria the degree to which candidates possess personal characteristics and sociocultural knowledge that may facilitate adaptation.

Relevance:

10.00%

Publisher:

Abstract:

In survival analysis, the response is usually the time until the occurrence of an event of interest, called the failure time. The main characteristic of survival data is the presence of censoring, which is a partial observation of the response. To fit such data, some models occupy an important position by properly fitting several practical situations, among which we can mention the Weibull model. Marshall-Olkin extended-form distributions offer a basic generalization that enables greater flexibility in fitting lifetime data. This paper presents a simulation study that compares the gradient test and the likelihood ratio test using the Marshall-Olkin extended-form Weibull distribution. As a result, there is only a small advantage for the likelihood ratio test.
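For reference, the Marshall-Olkin extension of a Weibull survival function and the two test statistics being compared take the standard forms below (textbook notation, assumed here rather than copied from the thesis):

    \bar{G}(t) = \frac{\alpha\,\bar{F}(t)}{1 - (1 - \alpha)\,\bar{F}(t)}, \qquad \alpha > 0,
    \qquad \bar{F}(t) = \exp\!\left[-(t/\lambda)^{k}\right], \quad t > 0,

    W = 2\left[\ell(\hat{\theta}) - \ell(\tilde{\theta})\right] \quad\text{(likelihood ratio)},
    \qquad
    S_T = U(\tilde{\theta})^{\top}(\hat{\theta} - \tilde{\theta}) \quad\text{(gradient)},

where \hat{\theta} and \tilde{\theta} are the unrestricted and restricted maximum likelihood estimates, \ell is the log-likelihood and U is the score function; both statistics are asymptotically chi-squared under the null hypothesis.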

Relevance:

10.00%

Publisher:

Abstract:

In general, an inverse problem corresponds to finding a value of an element x in a suitable vector space, given a vector y that measures it in some sense. When we discretize the problem, it usually boils down to solving an equation system f(x) = y, where f : U ⊂ R^m → R^n represents the step function on some domain U of the appropriate R^m. As a general rule, we arrive at an ill-posed problem. The resolution of inverse problems has been widely researched over the last decades, because many problems in science and industry consist in determining unknowns whose effects are observed only through indirect measures. The general subject of this dissertation is the choice of the Tikhonov regularization parameter for a poorly conditioned linear problem, as discussed in Chapter 1, focusing on the three most popular methods in the current literature of the area. The more specific focus is the set of simulations reported in Chapter 2, which aim to compare the performance of the three methods in the recovery of images measured with the Radon transform and perturbed by additive i.i.d. Gaussian noise. We chose a difference operator as the regularizer of the problem. The contribution we try to make consists mainly in the discussion of the numerical simulations we execute, as exposed in Chapter 2. We understand that the meaning of this dissertation lies much more in the questions it raises than in saying anything definitive about the subject, partly because it is based on numerical experiments with no new mathematical results attached, partly because those experiments use a single operator. On the other hand, we obtained some observations in the simulations that seemed interesting in light of the literature of the area. In particular, we highlight the observations summarized in the conclusion about the different vocations of methods such as GCV and the L-curve and about the tendency, observed with the L-curve method, of the optimal parameters to cluster in a small gap, strongly correlated with the behavior of the generalized singular value decomposition curve of the involved operators, under reasonably broad regularity conditions on the images to be recovered.
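Two of the three parameter-choice methods, GCV and the L-curve, are named in the text. A sketch of the generalized Tikhonov problem with a difference operator L and of the GCV functional, in standard notation assumed here:

    x_\lambda = \arg\min_x \; \|Ax - y\|_2^2 + \lambda^2 \|Lx\|_2^2
    \;\;\Longleftrightarrow\;\;
    (A^\top A + \lambda^2 L^\top L)\, x_\lambda = A^\top y,

    \mathrm{GCV}(\lambda) = \frac{\|A x_\lambda - y\|_2^2}{\left[\operatorname{trace}\!\left(I - A\,(A^\top A + \lambda^2 L^\top L)^{-1} A^\top\right)\right]^2}.

The L-curve method instead plots (\log\|Ax_\lambda - y\|_2, \log\|Lx_\lambda\|_2) over a range of \lambda and picks the parameter at the corner of maximum curvature; the generalized singular value decomposition of the pair (A, L) is the usual tool for evaluating both criteria efficiently.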

Relevance:

10.00%

Publisher:

Abstract:

In order to make this document self-contained, we first present all the necessary theory as background. We then study several definitions that extend the classic bi-implication into the domain of well-established fuzzy logics, namely into the [0, 1] interval. Those approaches to fuzzy bi-implication can be summarized as follows: two axiomatized definitions, which we prove represent the same class of functions, and four defining standards (two of them proposed by us), which vary in the number of different composed operators and in the restrictions those operators must satisfy. We prove that those defining standards represent only two classes of functions, one a proper subclass of the other, and both subclasses of the class represented by the axiomatized definitions. Since those three classes satisfy some constraints that we judge unnecessary, we propose a new defining standard free of those restrictions, which represents a class of functions that intersects the class represented by the axiomatized definitions. With this dissertation we aim to lay the groundwork for future research on this operator.
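As an illustration of the kind of defining standard discussed, the usual composition of a t-norm with a fuzzy implication yields a fuzzy bi-implication (a common textbook construction, assumed here; not necessarily one of the four standards studied):

    B(x, y) = T\big(I(x, y),\, I(y, x)\big), \qquad x, y \in [0, 1];

    \text{with } T_{LK}(a, b) = \max(0, a + b - 1) \text{ and } I_{LK}(x, y) = \min(1, 1 - x + y),
    \text{this gives } B(x, y) = 1 - |x - y|,

which collapses to the classical bi-implication on {0, 1}.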

Relevance:

10.00%

Publisher:

Abstract:

Currently, there are different definitions of fuzzy implication accepted in the literature. From a theoretical point of view, this lack of consensus shows that there is disagreement about the real meaning of "logical implication" in the Boolean and fuzzy contexts. From a practical point of view, it raises doubts about which "implication operators" software engineers should consider when implementing a Fuzzy Rule-Based System (FRBS). A poor choice of these operators may result in FRBSs with lower accuracy and less suited to their application domains. One way around this situation is to better understand the fuzzy logical connectives, which requires knowing which properties such connectives may satisfy. Therefore, in order to contribute to the meaning of fuzzy implication and to the implementation of more appropriate FRBSs, several Boolean laws have been generalized and studied as equations or inequations in fuzzy logics. Such generalizations are called Boolean-like laws, and they do not commonly hold in every fuzzy semantics. In this scenario, this dissertation investigates the necessary and sufficient conditions under which three Boolean-like laws — y ≤ I(x, y), I(x, I(y, x)) = 1 and I(x, I(y, z)) = I(I(x, y), I(x, z)) — remain valid in the fuzzy context, considering six classes of fuzzy implications and implications generated by automorphisms. In addition, still with the aim of implementing more appropriate FRBSs, we propose an extension of them.
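The three Boolean-like laws above are easy to probe numerically for a given implication. A minimal sketch, using the Łukasiewicz implication purely as an example (it may or may not belong to the six classes analyzed in the dissertation):

    import itertools

    def I_lk(x, y):
        """Lukasiewicz implication."""
        return min(1.0, 1.0 - x + y)

    def check_laws(I, steps=11, tol=1e-9):
        """Check the three Boolean-like laws on a regular grid of [0, 1]."""
        grid = [i / (steps - 1) for i in range(steps)]
        pairs = list(itertools.product(grid, repeat=2))
        law1 = all(y <= I(x, y) + tol for x, y in pairs)
        law2 = all(abs(I(x, I(y, x)) - 1.0) <= tol for x, y in pairs)
        law3 = all(abs(I(x, I(y, z)) - I(I(x, y), I(x, z))) <= tol
                   for x, y, z in itertools.product(grid, repeat=3))
        return law1, law2, law3

    print(check_laws(I_lk))  # (True, True, False): the third law fails for Lukasiewicz

The failure of the third law for a well-behaved implication illustrates the point made above that Boolean-like laws do not hold in every fuzzy semantics.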

Relevance:

10.00%

Publisher:

Abstract:

This work discusses the application of ensemble techniques to the development of multimodal recognition systems based on revocable (cancelable) biometrics. Biometric systems are the identification and user access control techniques of the future, and proof of this is the constant increase of such systems in today's society. However, much progress remains to be made, mainly with regard to the accuracy, security and processing time of such systems. In the search for more efficient techniques, multimodal systems and the use of revocable biometrics are promising, and can address many of the problems involved in traditional biometric recognition. A multimodal system is characterized by combining different biometric security techniques and overcomes many limitations, such as failures in extracting or processing the dataset. Among the various possibilities for developing a multimodal system, the use of ensembles is quite promising, motivated by the performance and flexibility they have demonstrated over the years in their many applications. With emphasis on security, one of the biggest problems is that biometric data are permanently tied to the user and cannot be changed if compromised. This problem, however, has been addressed by techniques known as revocable biometrics, which consist of applying a transformation to the biometric data in order to protect the unique characteristics, making their cancellation and replacement possible. In order to contribute to this important subject, this work compares the performance of individual classifiers, as well as ensembles of classifiers, in the context of both the original data and the biometric space transformed by different functions. Another highlighted factor is the use of Genetic Algorithms (GA) in different parts of the systems, seeking to further maximize their efficiency. One of the motivations of this development is to evaluate the gain that ensemble systems maximized by different GAs can bring to the data in the transformed space. Another relevant factor is to generate even more efficient revocable systems by combining two or more transformation functions, demonstrating that it is possible to extract information of a similar standard by applying different transformation functions. With all this, the importance of revocable biometrics, ensembles and GAs in the development of more efficient biometric systems, something increasingly important nowadays, becomes clear.
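A minimal sketch of the two ideas combined in this work: a key-seeded random projection standing in for a revocable (cancelable) transformation and a majority-vote ensemble. The transform, the scikit-learn-style classifiers and all parameters are illustrative assumptions, not those used in the thesis.

    import numpy as np

    def revocable_transform(features, user_key, out_dim=32):
        """Project features with a key-seeded random matrix; issuing a new key revokes the template."""
        rng = np.random.default_rng(user_key)
        projection = rng.standard_normal((features.shape[-1], out_dim))
        return features @ projection

    def majority_vote(classifiers, x):
        """Combine fitted classifiers (integer class labels assumed) by majority vote."""
        votes = np.stack([clf.predict(x) for clf in classifiers])  # shape: (n_classifiers, n_samples)
        return np.apply_along_axis(lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)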

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces a new variant of the Traveling Car Renter Problem, named the Prize-Collecting Traveling Car Renter Problem. In this problem, a set of vertices, each associated with a bonus, and a set of vehicles are given. The objective is to determine a cycle that visits some vertices, collecting at least a pre-defined bonus while minimizing the cost of the tour, which can be traveled with different vehicles. A mathematical formulation is presented and implemented in a solver to produce results for sixty-two instances. The proposed problem is also the subject of an experimental study based on the application of four metaheuristics representing the best adaptations of the state of the art in heuristic programming. We also provide new local search operators that exploit the neighborhoods of the problem, as well as construction procedures and adjustments created specifically for the addressed problem. Comparative computational experiments and performance tests are performed on a sample of 80 instances, aiming to offer a competitive algorithm for the problem. We conclude that the memetic, transgenetic and hybrid evolutionary algorithms are competitive in the tests performed.
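The core of such a formulation reduces to a tour-cost objective plus a bonus-quota constraint; a sketch in notation assumed here (the paper's actual model also carries the tour, vehicle-exchange and subtour-elimination constraints, omitted below):

    \min \sum_{k \in K}\sum_{(i,j) \in E} c_{ij}^{k}\, x_{ij}^{k}
    \qquad \text{subject to} \qquad
    \sum_{i \in V} p_i\, y_i \;\ge\; P_{\min},

where x_{ij}^{k} = 1 if edge (i, j) is traversed with vehicle k, y_i = 1 if vertex i is visited, p_i is the bonus of vertex i and P_{\min} is the pre-defined bonus to be collected.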

Relevance:

10.00%

Publisher:

Abstract:

The objective was to describe families' experiences with the immunization of children under two years of age. It is a descriptive study, with qualitative data analysis and unstructured interviews with 22 subjects. The results were grouped into: practical knowledge about immunization, responsibility and obligation in immunization, and expansion of immunization practice. Elements that strengthen immunization were highlighted: experience and personal fulfillment in the role of mother, fear of illness, recognition of immunization as good care, access, flexible hours, publicity, the vaccination card, vaccination campaigns and vaccine availability. Elements of non-immunization were: parents' inexperience, refusal of simultaneous vaccine applications, fragmented care, lack of dialogue, discrimination, false contraindications and compulsoriness. Immunization centered on compliance with the vaccination schedule, or on authoritarian situations, is detached from family care. The bond with families needs to be strengthened to increase adherence to child health protection and promotion measures.

Relevance:

10.00%

Publisher:

Abstract:

The gravity inversion method is a mathematical process that can be used to estimate the basement relief of a sedimentary basin. However, the inverse problem in potential-field methods has neither a unique nor a stable solution, so additional information (other than gravity measurements) must be supplied by the interpreter to transform it into a well-posed problem. This dissertation presents the application of a gravity inversion method to estimate the basement relief of the onshore Potiguar Basin. The density contrast between sediments and basement is assumed to be known and constant. The proposed methodology consists of discretizing the sedimentary layer into a grid of juxtaposed rectangular prisms whose thicknesses correspond to the depth to basement, which is the parameter to be estimated. To stabilize the inversion, I introduce constraints in accordance with the known geologic information. The method minimizes an objective function that requires the model not only to be smooth and close to the seismic-derived model, used as a reference model, but also to honor well-log constraints. The latter are introduced through logarithmic barrier terms in the objective function. The inversion process was applied so as to simulate different phases of the exploration development of a basin. The methodology consisted in applying the gravity inversion in distinct scenarios: the first used only gravity data and a plain reference model; the second was divided into two cases, in which either borehole log information or the seismic model was incorporated into the process. Finally, I incorporated the basement depth generated by seismic interpretation into the inversion as a reference model and imposed depth constraints from boreholes using the primal logarithmic barrier method. As a result, the estimated basement relief in every scenario satisfactorily reproduced the basin framework, and the incorporation of the constraints improved the definition of basement depth. The joint use of surface gravity data, seismic imaging and borehole logging information makes the process more robust and improves the estimate, providing a result closer to the actual basement relief. In addition, the result obtained in the first scenario already provided a very coherent basement relief when compared to the known basin framework. This is significant information when one considers the differences in cost and environmental impact between gravimetric and seismic surveys and well drilling.
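An objective function of the kind described, combining data misfit, smoothness, closeness to the seismic reference model and primal logarithmic-barrier terms for the well constraints, can be sketched as follows (notation and weights assumed here, not the dissertation's exact formulation):

    \Phi(\mathbf{p}) = \|\mathbf{d}^{\mathrm{obs}} - \mathbf{g}(\mathbf{p})\|_2^2
    + \lambda_1 \|\mathbf{R}\,\mathbf{p}\|_2^2
    + \lambda_2 \|\mathbf{p} - \mathbf{p}_{\mathrm{ref}}\|_2^2
    - \mu \sum_{j \in \mathcal{W}} \Big[\ln\big(p_j - p_j^{-}\big) + \ln\big(p_j^{+} - p_j\big)\Big],

where \mathbf{p} holds the prism thicknesses (depth to basement), \mathbf{g} is the forward gravity response, \mathbf{R} is a first-difference (smoothness) operator, \mathbf{p}_{\mathrm{ref}} is the seismic-derived reference model, \mathcal{W} indexes the prisms constrained by boreholes with bounds [p_j^{-}, p_j^{+}], and the barrier parameter \mu is driven toward zero as the iterations proceed.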

Relevance:

10.00%

Publisher:

Abstract:

The traditional approach to accidents assumes that compliance with procedures and norms protects the system against accidents and that these events result from faulty behaviors of workers, originating in part from aspects of their personalities. The identification of these behaviors is based on comparison with a standard grounded in the "safe way of doing", known in advance by safety specialists. In recent decades, alternative views to the traditional approach have emerged, widening the perimeter of accident analyses and opening the way to questioning its assumptions about the conceptions of human beings and of work. The new approaches help to expose the sterile results of traditional practices: blaming and punishing the victims, and recommending training and norms while leaving unchanged the systems in which the accidents occurred. The new approaches suggest the exhaustion of the traditional view and emphasize the importance of operators' contributions to system safety.

Relevance:

10.00%

Publisher:

Abstract:

Physical activity can be effective in primary as well as secondary and tertiary health care. The objectives of this article are to analyze the association between physical activity and the prevention or treatment of chronic non-communicable diseases and functional disability, and to review the main biological mechanisms responsible for this association and the current recommendations for exercise in these situations. Several epidemiological studies show an association between increased levels of physical activity and reduced overall and cardiovascular mortality in adults and the elderly. Although not yet fully understood, the mechanisms linking physical activity to the prevention and treatment of disease and functional disability mainly involve the reduction of body adiposity, the lowering of blood pressure, the improvement of the lipid profile and of insulin sensitivity, and increases in energy expenditure, muscle mass and strength, cardiorespiratory capacity, flexibility and balance. However, the quantity and quality of exercise needed to prevent health problems may differ from those needed to improve physical fitness. In general, the consensus guidelines for preventive or therapeutic exercise include aerobic and resistance activities, preferably added to the physical activities of daily life. Particularly for the elderly, or for adults with comorbidities or limitations that affect the capacity to perform physical activity, the guidelines recommend, in addition to these activities, the inclusion of exercises for the development of flexibility and balance.

Relevance:

10.00%

Publisher:

Abstract:

In order to evaluate the efficiency of a castor-oil polyurethane prosthesis as a partial substitute for the common calcaneal tendon, 30 New Zealand female rabbits, between two and three months of age and with a mean weight of 2 kg, were used. After general anesthesia, the surgical procedure in both limbs consisted of a caudolateral longitudinal incision from the middle to the distal third of the tibia and exposure of the common calcaneal tendon. After tenectomy of the gastrocnemius muscle tendon, the polyurethane prosthesis, about 0.5 cm long and 0.5 cm in diameter, was fixed to the proximal and distal stumps of the tendon using 4-0 monofilament polypropylene suture, following a modified Kessler technique. The polyurethane prosthesis in its elastomeric form showed properties such as texture and flexibility similar to those of tendon tissue; it can be manufactured in the desired shape and size and can be molded, cut and sterilized by moist heat. All animals bore weight on the operated limbs immediately after recovering from anesthesia. No clinical signs of infection were observed and no wound dehiscence occurred. Local swelling due to edema was noticed, evident in the first postoperative week, and gradually disappeared. On palpation it was easy to delimit the prosthesis, which remained fixed in place and intact. Clinically, the castor-oil polyurethane did not induce any unfavorable reaction compromising tendon healing, and it can be indicated as a temporary tendon substitute.