Abstract:
In this article, the authors propose a method for the automatic accumulation of a system of tomographic equations when solving inverse problems of seismics using the least-squares method. The methodology was applied to several model problems and to the second horizon of the left bank of the Enguri arch dam.
Abstract:
The article proves the uniqueness of the solution of the inverse problem for circular polygons in two cases: first, for constant density, and second, for a positive density that does not vary with direction.
Abstract:
Background: The use of three-dimensional rotational angiography (3D-RA) to assess patients with congenital heart diseases appears to be a promising technique despite the scarce literature available. Objectives: The objective of this study was to describe our initial experience with 3D-RA and to compare its radiation dose to that of standard two-dimensional angiography (2D-SA). Methods: Between September 2011 and April 2012, 18 patients underwent simultaneous 3D-RA and 2D-SA during diagnostic cardiac catheterization. Radiation dose was assessed using the dose-area product (DAP). Results: The median patient age and weight were 12.5 years and 47.5 kg, respectively. The median DAP of each 3D-RA acquisition was 1093 µGy·m² versus 190 µGy·m² for each 2D-SA acquisition (p < 0.01). In patients weighing more than 45 kg (n = 7), this difference was attenuated but still significant (1525 µGy·m² vs. 413 µGy·m², p = 0.01). No difference was found between one 3D-RA and three 2D-SA acquisitions (1525 µGy·m² vs. 1238 µGy·m², p = 0.575) in this population. The difference was significantly higher in patients weighing less than 45 kg (n = 9) (713 µGy·m² vs. 81 µGy·m², p = 0.008), even when comparing one 3D-RA with three 2D-SA acquisitions (242 µGy·m², p < 0.008). 3D-RA was extremely useful for the assessment of conduits of univentricular hearts, tortuous branches of the pulmonary artery, and the aorta relative to 2D-SA acquisitions. Conclusions: The radiation dose of 3D-RA used in our institution was higher than those previously reported in the literature, and this difference was more evident in children. This type of assessment is of paramount importance when starting to perform 3D-RA.
Abstract:
This work focuses on the modeling and numerical approximation of population balance equations (PBEs) for the simulation of different phenomena occurring in process engineering. The population balance equation (PBE) is considered to be a statement of continuity. It tracks the change in the particle size distribution as particles are born, die, grow, or leave a given control volume. In population balance models, one independent variable represents time and the other(s) are property coordinate(s), e.g., the particle volume (size) in the present case. They typically describe the temporal evolution of number density functions and have been used to model various processes such as granulation, crystallization, polymerization, emulsion, and cell dynamics. Semi-discrete high-resolution schemes are proposed for solving PBEs modeling one- and two-dimensional batch crystallization. The schemes are discrete in the property coordinates but continuous in time, so the resulting ordinary differential equations can be solved by any standard ODE solver. To improve the numerical accuracy of the schemes, a moving-mesh technique is introduced in both the one- and two-dimensional cases ...
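The semi-discrete (method-of-lines) idea described in this abstract can be illustrated with a minimal sketch. The scheme below is an assumption for illustration, not the authors' high-resolution scheme: it discretizes a pure-growth 1-D PBE, ∂n/∂t + G ∂n/∂x = 0, with a first-order upwind difference in the size coordinate and hands the resulting ODE system to a standard solver; the growth rate G, grid, and initial density are hypothetical.

```python
# Method-of-lines sketch for a 1-D growth-only population balance equation.
# Discrete in the property (size) coordinate, continuous in time, as the
# abstract describes; solved with a standard ODE integrator.
import numpy as np
from scipy.integrate import solve_ivp

G = 1.0                             # constant growth rate (hypothetical)
x = np.linspace(0.0, 10.0, 201)     # size (property) coordinate grid
dx = x[1] - x[0]

def rhs(t, n):
    # Upwind differencing: for G > 0, information moves toward larger sizes.
    dn = np.empty_like(n)
    dn[0] = -G * n[0] / dx          # zero inflow at the small-size boundary
    dn[1:] = -G * (n[1:] - n[:-1]) / dx
    return dn

n0 = np.exp(-(x - 2.0) ** 2)        # initial number density: Gaussian pulse
sol = solve_ivp(rhs, (0.0, 3.0), n0, method="RK45", rtol=1e-6, atol=1e-9)

# The pulse advects from x = 2 to roughly x = 2 + G*t = 5 (with some
# numerical diffusion from the first-order scheme).
peak = x[np.argmax(sol.y[:, -1])]
```

A higher-resolution scheme, as in the work above, would replace the upwind flux with a limited reconstruction to reduce the numerical diffusion visible here.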
Abstract:
An appropriate assessment of end-to-end network performance presumes highly efficient time tracking and measurement, with precise control of the stopping and resuming of program operation. In this paper, a novel approach to solving the problems of highly efficient and precise time measurement on PC platforms and ARM architectures is proposed. A new unified High Performance Timer and a corresponding software library offer a unified interface to the known time counters and automatically identify the fastest and most reliable time source available in the user space of a computing system. The research focuses on developing an approach of unified time acquisition from the PC hardware, substituting the common way of getting the time value through Linux system calls. The presented approach provides a much faster means of obtaining time values with nanosecond precision than conventional means. Moreover, it handles sequential time values, precise sleep functions, and process resuming. This reduces the computer resources wasted during the execution of a sleeping process from 100% (busy-wait) to 1-1.5%, while the benefits of very accurate process-resume times on long waits are maintained.
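The trade-off the abstract describes, between a pure busy-wait (accurate but 100% CPU) and an OS sleep (cheap but coarse), can be sketched with a hybrid sleep. This is an illustrative assumption, not the paper's HPT library: it sleeps through most of the interval and busy-waits only a short tail so that the wake-up time is precise; the 1 ms tail length is a hypothetical parameter.

```python
# Hybrid precise-sleep sketch: OS sleep for the bulk of the interval
# (near-zero CPU), busy-wait only the final slice for an accurate wake-up.
import time

def precise_sleep(duration_s, busy_tail_s=0.001):
    """Sleep for duration_s, busy-waiting only the last busy_tail_s."""
    deadline = time.perf_counter_ns() + int(duration_s * 1e9)
    coarse = duration_s - busy_tail_s
    if coarse > 0:
        time.sleep(coarse)                  # coarse, scheduler-dependent
    while time.perf_counter_ns() < deadline:
        pass                                # short, precise busy-wait

start = time.perf_counter_ns()
precise_sleep(0.05)
elapsed_ms = (time.perf_counter_ns() - start) / 1e6   # never less than 50
```

The fraction of the interval spent busy-waiting (here 1 ms out of 50 ms, i.e. 2%) is what governs the CPU cost, which is the same lever behind the 100% to 1-1.5% reduction reported above.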
Abstract:
It is well known that for an organized society to develop politically and legally, a formal document of mandatory observance is indispensable, one capable of defining public competences and delimiting the powers of the State, safeguarding fundamental rights against possible abuses by political entities. This document is the Constitution, which has been present in States at every moment of history, though initially not in written form. This gave rise to constitutionalism, a movement that defended the need to draft written constitutions, endowed with normativity and supremacy over the other normative species, aimed at organizing the separation of state powers and declaring individual rights and liberties. However, enacting a Supreme Law would be of no use without defense mechanisms to ward off any threat to legal certainty and social stability posed by a law or normative act contrary to the precepts established in the Constitution. Constitutional review, a pillar of the rule of law, consists of verifying the compatibility between a law or any infra-constitutional normative act and the Supreme Law; where there is a conflict, the defective law or act must be expunged from the legal order so that constitutional unity is restored. In Brazil, constitutional review was instituted under strong influence of the North American model and received varied treatment across the Brazilian constitutions, but the system of constitutional review reached its apex with the advent of the current Federal Constitution, promulgated on October 5, 1988, through the creation of innovative procedural instruments for verifying the constitutionality of laws and normative acts.
Moreover, the 1988 Charter of the Republic, unlike its predecessors, strengthened the Judiciary in the political context, granting judges greater autonomy in resolving cases of great national repercussion and leading to the judicial protagonism seen today. In this context, the Supremo Tribunal Federal, the highest organ of the national Judiciary and guardian of the Constitution, has stood out on the national scene, especially in the defense of the fundamental rights and guarantees inscribed in the Fundamental Law. An analysis of the Court's jurisprudence is therefore necessary to verify whether constitutional review in Brazil has in fact evolved over recent years and, if so, under what circumstances.
Abstract:
Despite the huge increase in processor and interprocessor network performance, many computational problems remain unsolved due to a lack of critical resources such as sustained floating-point performance, memory bandwidth, etc. Examples of these problems are found in climate research, biology, astrophysics, high-energy physics (Monte Carlo simulations), and artificial intelligence, among other areas. For some of these problems, the computing resources of a single supercomputing facility can be one or two orders of magnitude short of the resources needed to solve them. Supercomputer centers face an increasing demand for processing performance, with the direct consequence of a growing number of processors and systems, resulting in more difficult administration of HPC resources and the need for more physical space, higher electrical power consumption, and improved air conditioning, among other problems. Some of these problems cannot be easily solved, so grid computing, intended as a technology enabling the addition and consolidation of computing power, can help in solving large-scale supercomputing problems. In this document, we describe how two supercomputing facilities in Spain joined their resources to solve a problem of this kind. The objectives of this experience were, among others, to demonstrate that such cooperation can enable the solution of larger problems and to measure the efficiency that could be achieved. We show some preliminary results of this experience and to what extent these objectives were achieved.
Abstract:
We say the endomorphism problem is solvable for an element W of a free group F if, given U in F, it can be decided effectively whether there is an endomorphism Φ of F sending W to U. This work analyzes an approach due to C. Edmunds and improved by C. Sims. We prove that when W is a two-generator word, the approach yields an efficient algorithm that solves the endomorphism problem in time polynomial in the length of U. This result gives a polynomial-time algorithm for solving, in free groups, two-variable equations in which all the variables occur on one side of the equality and all the constants on the other side.
Abstract:
The decisions of many individuals and social groups, taken according to well-defined objectives, are causing serious social and environmental problems, in spite of following the dictates of economic rationality. There are many examples of serious problems for which there are not yet appropriate solutions, such as the management of scarce natural resources, including aquifer water, or the distribution of space among incompatible uses. In order to solve these problems, the paper first characterizes the resources and goods involved from an economic perspective. Then, for each case, the paper notes that there is a serious divergence between individual and collective interests and, where possible, designs a procedure for solving the conflict of interests. With this procedure, the real opportunities for the application of economic theory are shown, especially the theory of collective goods and externalities. The limitations of conventional economic analysis are shown, and the opportunity to correct the shortfalls is examined. Many environmental problems, such as climate change, have an impact on different generations that do not participate in present decisions. The paper shows that for these cases the solutions suggested by economic theory are not valid. Furthermore, conventional methods of economic valuation (which usually help decision-makers) are unable to account for the existence of different generations and tend to obviate long-term impacts. The paper analyzes how economic valuation methods could account for the costs and benefits enjoyed by present and future generations. It studies an appropriate consideration of preferences for future consumption and the incorporation of sustainability as a requirement in social decisions, which implies not only more efficiency but also a fairer distribution between generations than the one implied by conventional economic analysis.
Abstract:
Let T be the Cayley graph of a finitely generated free group F. Given two vertices in T, consider all the walks of a given length between these vertices that, at a certain time, must follow a number of predetermined steps. We give formulas for the number of such walks by expressing the problem in terms of equations in F and solving the corresponding equations.
Abstract:
We have compared three cases of payments for water-related environmental services (PES) in Central America in terms of socioeconomic background, opportunity costs of forest conservation, and stakeholders' perceptions of the conditions of water resources and other issues. We found that, in general, the foregone benefits from land uses alternative to forest cover are larger than the amount paid, which apparently contradicts the economic foundation of PES schemes. A number of possible explanations are explored. The results also suggest that trade-offs between different environmental and social goals are likely to emerge in PES schemes, casting some doubt on their ability to be multipurpose instruments for environmental improvement and rural development. We also found that PES schemes may work as a conflict-resolution instrument, facilitating downstream-upstream problem solving, though at the same time they might introduce changes in social perceptions of property rights.
Abstract:
Study carried out during a stay at the Imperial College of London, Great Britain, between September and December 2006. Having a good, well-defined geometry is essential for solving many computational models efficiently and obtaining results comparable to the real problem. Medical image reconstruction makes it possible to transform images obtained with acquisition techniques into geometries in numerical data formats. This text explains qualitatively the various stages of the medical image reconstruction process, up to finally obtaining a triangular mesh that can be processed by computational algorithms. The process starts at the MRI scanner of The Royal Brompton Hospital in London, from which images are obtained and then processed with the CONGEN10 and SURFGEN tools for a MATLAB environment. These tools were developed by researchers of the Bioflow group of the aeronautical engineering department of the Imperial College of London. The last section of the text discusses an example of an artery that enters as a medical image and comes out as a triangular mesh that can be processed by any software or algorithm that works with meshes.
Abstract:
Counter automata are more powerful versions of finite-state automata in which addition and subtraction operations are permitted on a set of n integer registers, called counters. We show that the word problem of Z^n is accepted by a nondeterministic m-counter automaton if and only if m ≥ n.
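The easy direction of the result above (n counters suffice) can be illustrated with a small sketch. This is an illustrative simplification, not the paper's construction, and it is deterministic: one counter per generator of Z^n accumulates the exponent sum of the input word, and the word represents the identity exactly when every counter returns to zero, since Z^n is abelian.

```python
# Sketch: deciding the word problem of Z^n with n integer counters.
# A word is a sequence of (generator_index, sign) pairs, sign = +1 for x_i
# and -1 for x_i^{-1}; it is the identity iff all exponent sums vanish.
def accepts_identity(word, n):
    counters = [0] * n
    for gen, sign in word:
        counters[gen] += sign       # add/subtract on the gen-th counter
    return all(c == 0 for c in counters)

# x0 x1 x0^{-1} x1^{-1} is the identity in Z^2 (the group is abelian)
print(accepts_identity([(0, 1), (1, 1), (0, -1), (1, -1)], 2))  # True
print(accepts_identity([(0, 1), (1, 1)], 2))                    # False
```

The hard direction of the theorem, that fewer than n counters cannot accept this language even with nondeterminism, is the substance of the result and is not captured by this sketch.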
Abstract:
Aeromonas hydrophila is a Gram-negative bacillus and an opportunistic pathogen of animals and humans. The pathogenesis of A. hydrophila is multifactorial. To identify genes involved in the virulence of A. hydrophila strain PPD134/91, we performed gene-subtraction experiments, which led to the detection of 22 DNA fragments encoding 19 potential virulence factors, including a gene encoding a type III secretion system (T3SS) protein. The growing importance of the T3SS in the pathogenesis of several bacteria led us to identify and analyze the T3SS gene cluster of A. hydrophila strains AH-1 and AH-3. Inactivation of the T3SS genes aopB and aopD in A. hydrophila AH-1, and of ascV in A. hydrophila AH-3, leads to decreased cytotoxicity, increased phagocytosis, and reduced virulence in different animal models. These results demonstrate that the T3SS is necessary for pathogenicity. We also cloned and sequenced an ADP-ribosyltransferase (AexT) in A. hydrophila strain AH-3 and showed that this toxin is translocated via the T3SS, a system which in turn appears to be inducible in vitro under calcium-depletion conditions. The aexT mutant of A. hydrophila strain AH-3 showed a slight reduction in virulence, assayed by different methods. Using different DNA probes, we determined the presence of the T3SS in both clinical and environmental strains of different species of the genus Aeromonas (A. hydrophila, A. veronii, and A. caviae), as well as the co-distribution of this gene cluster and the aexT gene. Finally, in order to study the transcriptional regulation of the T3SS gene cluster and of the effector AexT in A. hydrophila AH-3, we isolated the predicted promoters of the aopN-aopD operon and the aexT gene and fused them to the gfp (Green Fluorescent Protein) reporter gene.
In addition, we showed that the expression of both promoters depends on different bacterial components, such as the two-component system PhoP/PhoQ, the quorum-sensing system AhyI/AhyR, and the pyruvate dehydrogenase complex.
Abstract:
This article focuses on the implications of electronic dissemination for the peer-reviewed journal publishing system. To make sense of such a complex issue, it helps greatly to look at it from the perspective of the system's origins and its three core functions: ranking research, facilitating interactive communication among scholars, and creating a global archive of scientific knowledge. Each of these main functions has different requirements that overlap to some extent but also partly conflict. The Internet opens up the possibility of developing a variety of distinct models of scholarly communication by modulating the intensity of each of the three roles that print journals have performed, and possibly of other functions that were not even imaginable before the development of worldwide electronic networks. The implications of electronic distribution for the ownership of and access to the scientific literature are profound, and they tend to aggravate the already serious journal-price crisis that is curbing access to scientific information. The scholarly community, which authors the material these publications contain and is at the same time its main consumer, holds the key to solving this crisis by allowing the Internet to become a vehicle that facilitates the dissemination of publicly funded research, instead of creating a situation of private ownership of that research.