Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
Introduction. Subfascial endoscopic perforator surgery (SEPS) enables the direct visualization and section of perforating veins. Morbidity and duration of hospitalization are both lower than with conventional open surgery (Linton's or Felder's techniques). Patients and methods. A total of 322 legs from 285 patients with a mean age of 56 years (range 23-90) were treated at our Department from May 1996 to January 2010. In 309 cases, an endoscope (ETM Endoskopische Technik GmbH, Berlin, Germany) was introduced through a transverse incision approximately 1.5 cm in length and 10 cm from the tibial tuberosity, as in Linton's technique. A Spacemaker balloon dissector for SEPS, involving a second incision 6 cm from the first, was used in only 13 cases. Results. The procedure used in each case was decided on the basis of preoperative evaluation. SEPS and stripping were performed in 238 limbs (73.91%), SEPS and short stripping in 7 limbs (2.17%), SEPS and crossectomy in 51 limbs (15.84%), and SEPS alone in 26 limbs (8.07%). A total of 103 patients presented 158 trophic ulcers; healing time was between 1 and 3 months, with a healing rate of 82.91% after 1 month and 98.73% after 3 months. Conclusion. Subfascial ligature of perforating veins is superior to sclerotherapy and to minimally invasive suprafascial treatment of chronic venous insufficiency (CVI). It is easy to execute, minimally invasive, and has few complications.
Abstract:
MATOS FILHO, João. The decentralization of rural development policies: an analysis of the Rio Grande do Norte experience. 2002. 259 pp. Doctoral thesis (Economics) – Instituto de Economia, Universidade Estadual de Campinas, Campinas, 2002.
Abstract:
In contrast to animals and lower plant species, sperm cells of flowering plants are non-motile and are transported to the female gametes via the pollen tube, i.e., the male gametophyte. Upon arrival at the female gametophyte, two sperm cells are discharged into the receptive synergid cell to execute double fertilization. The first players involved in inter-gametophyte signaling to attract pollen tubes and to arrest their growth have recently been identified. In contrast, the physiological mechanisms leading to pollen tube burst, and thus sperm discharge, remained elusive. Here, we describe the role of the polymorphic defensin-like cysteine-rich proteins ZmES1-4 (Zea mays embryo sac) from maize in pollen tube growth arrest, burst, and explosive sperm release. ZmES1-4 genes are exclusively expressed in the cells of the female gametophyte. ZmES4-GFP fusion proteins accumulate in vesicles at the secretory zone of mature synergid cells and are released during the fertilization process. Using RNAi knock-down and synthetic ZmES4 proteins, we found that ZmES4 induces pollen tube burst in a species-preferential manner. Pollen tube plasma membrane depolarization, which occurs immediately after ZmES4 application, as well as channel-blocker experiments, points to a role of K(+) influx in the pollen tube rupture mechanism. Finally, we discovered the intrinsic rectifying K(+) channel KZM1 as a direct target of ZmES4. Following ZmES4 application, KZM1 opens at physiological membrane potentials and closes after wash-out. In conclusion, we suggest that vesicles containing ZmES4 are released from the synergid cells upon male-female gametophyte signaling. Subsequent interaction between ZmES4 and KZM1 results in channel opening and K(+) influx. We further suggest that K(+) influx leads to water uptake and culminates in osmotic tube burst. The species-preferential activity of polymorphic ZmES4 indicates that the mechanism described represents a pre-zygotic hybridization barrier and may be a component of reproductive isolation in plants.
Abstract:
Elasticity is one of the best-known capabilities of cloud computing, and it is largely deployed reactively using thresholds. In this approach, maximum and minimum limits drive resource allocation and deallocation actions, leading to the following problem statements: How can cloud users set threshold values to enable elasticity in their cloud applications? And what is the impact of the application's load pattern on elasticity? This article tries to answer these questions for iterative high-performance computing applications, showing the impact of both thresholds and load patterns on application performance and resource consumption. To accomplish this, we developed a reactive, PaaS-based elasticity model called AutoElastic and employed it on a private cloud to execute a numerical integration application. Here, we present an analysis of best practices and possible optimizations regarding the combination of elasticity and HPC. Considering the results, we observed that the maximum threshold influences application time more than the minimum one. We concluded that threshold values close to 100% of CPU load are directly related to weaker reactivity, postponing resource reconfiguration when activating it earlier could be pertinent for reducing the application runtime.
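The sketch below illustrates such a reactive, threshold-based elasticity loop; the monitoring/provisioning interface and the threshold values are hypothetical stand-ins, not AutoElastic's actual API.

```python
# Minimal sketch of a reactive, threshold-based elasticity controller.
# FakeCloud and both threshold values are illustrative placeholders.

UPPER_THRESHOLD = 0.80   # scale out above this average CPU load
LOWER_THRESHOLD = 0.30   # scale in below this average CPU load

class FakeCloud:
    """Stand-in for a private-cloud monitor/provisioner."""
    def __init__(self, loads):
        self.loads = loads                 # one CPU load per active node

    def average_load(self):
        return sum(self.loads) / len(self.loads)

    def allocate_node(self):
        self.loads.append(0.0)             # reactive scale-out

    def deallocate_node(self):
        self.loads.pop()                   # reactive scale-in

def elasticity_step(cloud):
    """One monitoring cycle: compare observed load against both thresholds."""
    load = cloud.average_load()
    if load > UPPER_THRESHOLD:
        cloud.allocate_node()
    elif load < LOWER_THRESHOLD and len(cloud.loads) > 1:
        cloud.deallocate_node()

cloud = FakeCloud([0.95, 0.90])            # heavy load triggers scale-out
elasticity_step(cloud)
print(len(cloud.loads), "nodes after one cycle")   # -> 3 nodes
```

In these terms, pushing UPPER_THRESHOLD toward 100% delays the scale-out branch, which is the weaker reactivity the results describe.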
Abstract:
Este trabalho propõe um estudo de sinais cerebrais aplicados em sistemas BCI (Brain-Computer Interface - Interfaces Cérebro Computador), através do uso de Árvores de Decisão e da análise dessas árvores com base nas Neurociências. Para realizar o tratamento dos dados são necessárias 5 fases: aquisição de dados, pré-processamento, extração de características, classificação e validação. Neste trabalho, todas as fases são contempladas. Contudo, enfatiza-se as fases de classificação e de validação. Na classificação utiliza-se a técnica de Inteligência Artificial denominada Árvores de Decisão. Essa técnica é reconhecida na literatura como uma das formas mais simples e bem sucedidas de algoritmos de aprendizagem. Já a fase de validação é realizada nos estudos baseados na Neurociência, que é um conjunto das disciplinas que estudam o sistema nervoso, sua estrutura, seu desenvolvimento, funcionamento, evolução, relação com o comportamento e a mente, e também suas alterações. Os resultados obtidos neste trabalho são promissores, mesmo sendo iniciais, visto que podem melhor explicar, com a utilização de uma forma automática, alguns processos cerebrais.
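As a minimal sketch of the classification and validation phases, the example below trains a decision tree on synthetic stand-ins for extracted EEG features; scikit-learn's DecisionTreeClassifier is assumed here as one common implementation, not taken from the original work.

```python
# Minimal sketch of classification and validation on already-extracted
# EEG features; the data here are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))        # 200 trials x 8 extracted features
y = rng.integers(0, 2, size=200)     # two mental tasks / classes

clf = DecisionTreeClassifier(max_depth=4)    # shallow tree stays readable
scores = cross_val_score(clf, X, y, cv=5)    # validation by cross-validation
print(f"cross-validated accuracy: {scores.mean():.2f}")

clf.fit(X, y)
print(export_text(clf))   # explicit rules that can be inspected and
                          # checked against neuroscience knowledge
```

The exported rules are what make the tree amenable to the kind of neuroscience-based validation the work emphasizes.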
Abstract:
Symbolic execution is a powerful program analysis technique, but it is very challenging to apply to programs built using event-driven frameworks, such as Android. The main reason is that the framework code itself is too complex to execute symbolically. The standard solution is to manually create a framework model that is simpler and more amenable to symbolic execution. However, developing and maintaining such a model by hand is difficult and error-prone. We claim that we can leverage program synthesis to introduce a high degree of automation into the process of framework modeling. To support this thesis, we present three pieces of work. First, we introduced SymDroid, a symbolic executor for Android. While Android apps are written in Java, they are compiled to the Dalvik bytecode format. Instead of analyzing an app's Java source, which may not be available, or decompiling from Dalvik back to Java, which requires significant engineering effort and introduces yet another source of potential bugs in an analysis, SymDroid works directly on Dalvik bytecode. Second, we introduced Pasket, a new system that takes a first step toward automatically generating Java framework models to support symbolic execution. Pasket takes as input the framework API and tutorial programs that exercise the framework. From these artifacts and Pasket's internal knowledge of design patterns, Pasket synthesizes an executable framework model by instantiating design patterns, such that the behavior of a synthesized model on the tutorial programs matches that of the original framework. Lastly, in order to scale program synthesis to framework models, we devised adaptive concretization, a novel program synthesis algorithm that combines the best of the two major synthesis strategies: symbolic search, i.e., using SAT or SMT solvers, and explicit search, e.g., stochastic enumeration of possible solutions. Adaptive concretization parallelizes multiple sub-synthesis problems by partially concretizing highly influential unknowns in the original synthesis problem. Thanks to adaptive concretization, Pasket can generate a large-scale model, e.g., thousands of lines of code. In addition, we have used an Android model synthesized by Pasket and found that the model is sufficient to allow SymDroid to execute a range of apps.
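To make the core technique concrete, here is a self-contained toy sketch of symbolic execution; it has none of SymDroid's machinery and works on a hard-coded two-branch program. Each conditional forks the execution, and every path accumulates its path condition.

```python
# Toy symbolic execution of the program:
#   if x > 10:
#       if y < 5: return "A"
#       else:     return "B"
#   else:         return "C"
# Each path is explored with its accumulated path condition.

def symbolic_execute():
    paths = []
    for x_gt_10 in (True, False):              # fork on the first branch
        pc = ["x > 10" if x_gt_10 else "not (x > 10)"]
        if x_gt_10:
            for y_lt_5 in (True, False):       # fork on the nested branch
                pc2 = pc + ["y < 5" if y_lt_5 else "not (y < 5)"]
                paths.append((pc2, "A" if y_lt_5 else "B"))
        else:
            paths.append((pc, "C"))
    return paths

for constraints, result in symbolic_execute():
    print(" and ".join(constraints), "->", result)
```

A real symbolic executor would hand each path condition to a SAT/SMT solver to decide feasibility and generate concrete inputs; the toy above simply enumerates both outcomes of every branch.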
Abstract:
A correct understanding of how computers run code is mandatory in order to effectively learn to program. Lectures have historically been used in programming courses to teach how computers execute code, and students are assessed through traditional evaluation methods, such as exams. Constructivist learning theory objects to students' passiveness during lessons and to traditional quantitative methods for evaluating a complex cognitive process such as understanding. Constructivism proposes complementary techniques, such as conceptual contraposition and colloquies. We enriched the lectures of a Programming II (CS2) course by combining conceptual contraposition with program memory tracing, then evaluated students' understanding of programming concepts through colloquies. Results revealed that these techniques, applied to the lecture alone, are insufficient to help students develop satisfactory mental models of the C++ notional machine, and that colloquies behaved like the most comprehensive traditional evaluations conducted in the course.
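Program memory tracing asks students to follow variable state line by line as a program runs. The sketch below shows one automated way to produce such a trace, using Python's standard sys.settrace hook; the course itself concerned the C++ notional machine, so this is only an illustration of the idea.

```python
# Minimal sketch of program memory tracing: print each executed line
# number together with the local variables visible at that point.
import sys

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "demo":
        print(f"line {frame.f_lineno}: locals = {frame.f_locals}")
    return tracer          # keep tracing inside this frame

def demo():
    a = 1
    b = a + 2
    a, b = b, a            # students predict each state, then compare

sys.settrace(tracer)
demo()
sys.settrace(None)
```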
Abstract:
Research points to a gap between academic or disciplinary-based geography and what is taught in secondary classes across the nation. This study documents a teacher's journey and efforts to bring a more disciplinary approach to two suburban, heterogeneous sixth-grade geography classrooms. The researcher traces student perspectives on geography and facility with geographic reasoning, as well as his own perspectives and pedagogy with respect to the student data. The study attempts to map the space where school geography meets and interacts with disciplinary-oriented geography based upon the Geography for Life National Geography Standards. Participants completed two sets of baseline assessments, two sets of end-of-year assessments, and an initial intake survey. The seven primary participants were interviewed five times each throughout the academic school year, and the data were open-coded. The data suggest that students can learn geography and geographic reasoning from a disciplinary perspective. Students sharpened their geographic skills through deeper subject-matter knowledge and by developing spatial and ecological perspectives. The data also indicate that the teacher-researcher faced considerable challenges in implementing a disciplinary approach to teaching geography. The coverage demands of a crowded, history-centric curriculum, together with ill-fitting resources, required a labor-intensive effort to put together and execute this study. The findings indicate that the path to good geography pedagogy can be impeded by a host of external and internal challenges. For forward-thinking practitioners, however, the effort to straddle the gap between school geography and disciplinary-based geography may well be worth it.
Abstract:
The purpose of this Master's thesis was to study a process taking place in the office environment of an SME and to make its operation more efficient using lean tools. To identify waste in the process, a value stream mapping was carried out at the target company; as a result, the wastes in the different phases of the process were identified and preventive lean measures were targeted at them. The study was conducted as qualitative action research, comprising a literature section and an empirical section in the form of the value stream mapping performed at the target company. The theory of the literature section was collected from scientific literature and articles, the articles consisting largely of internet sources. The empirical section and its data were gathered by studying the case company's office process and by conducting numerous interviews with the company's personnel. The most waste was caused by a lack of documentation and tools, and by an internal inability to cooperate. The results of the work support the use of lean tools, since most of the waste encountered can easily be prevented through better documentation of the work and by adjusting daily working practices. The most suitable lean tools were standardized work, kaizen, and gemba.
Abstract:
Human operators are unique in their decision-making capability, judgment, and nondeterminism. Their sense of judgment, unpredictable decision procedures, and susceptibility to environmental elements can cause them to erroneously execute a given task description when operating a computer system. Usually, a computer system is protected against some erroneous human behaviors by having the necessary safeguard mechanisms in place. But some erroneous human operator behaviors can lead to severe or even fatal consequences, especially in safety-critical systems. A generalized methodology for modeling and analyzing the interactions between computer systems and human operators, where the operators are allowed to deviate from their prescribed behaviors, will provide a formal understanding of the robustness of a computer system against possible aberrant behaviors by its human operators. We provide several methodologies for assisting in modeling and analyzing human behaviors exhibited while operating computer systems. Every human operator is usually given a specific recommended set of guidelines for operating a system. We first present a process-algebraic methodology for modeling and verifying recommended human task execution behavior. We then show how one can perform runtime monitoring of a computer system being operated by a human operator to check for violations of temporal safety properties. We consider the concept of a protection envelope, giving a wider class of behaviors than those strictly prescribed by a human task that can be tolerated by a system. We then provide a framework for determining whether a computer system can maintain its guarantees if the human operators operate within their protection envelopes. This framework also helps to determine the robustness of the computer system under weakening of the protection envelopes. In this regard, we present a tool called Tutela that assists in implementing the framework. We then examine the ability of a system to remain safe under broad classes of variations of the prescribed human task. We develop a framework addressing two issues. The first issue is: given a human task specification and a protection envelope, will the protection envelope properties still hold under standard erroneous executions of that task by the human operators? In other words, how robust is the protection envelope? The second issue is: in the absence of a protection envelope, can we approximate a protection envelope encompassing those standard erroneous human behaviors that can be safely endured by the system? We present an extension of Tutela that implements this framework. The two frameworks mentioned above use Concurrent Game Structures (CGS) as models for both computer systems and their human operators. However, this formalism has some shortcomings for our uses. We add incomplete-information concepts to CGSs to achieve better modularity for the players, and we introduce nondeterminism both in the transition system and in the strategies of the players modeling human operators and computer systems. The resulting incomplete-information Nondeterministic CGS (iNCGS), with nondeterministic action strategies for players, is a more precise formalism for modeling human behaviors exhibited while operating a computer system. We show how we can reason about a human behavior satisfying a guarantee by providing a semantics for Alternating-time Temporal Logic based on iNCGS player strategies.
In a nutshell, this dissertation provides formal methodologies for modeling and analyzing system robustness against both expected and erroneous human operator behaviors.
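As a minimal illustration of runtime monitoring for a temporal safety property, the sketch below checks a hypothetical property ("never fire unless armed since the last reset") against a stream of operator events; both the property and the event names are invented for illustration and are not taken from Tutela.

```python
# Minimal sketch: monitor the safety property
# "never execute 'fire' unless 'arm' has happened since the last 'reset'",
# encoded as a small finite-state automaton over operator events.

class SafetyMonitor:
    def __init__(self):
        self.armed = False
        self.violated = False

    def observe(self, event):
        if event == "arm":
            self.armed = True
        elif event == "reset":
            self.armed = False
        elif event == "fire" and not self.armed:
            self.violated = True        # safety property violated
        return not self.violated

monitor = SafetyMonitor()
for event in ["arm", "fire", "reset", "fire"]:   # hypothetical operator trace
    if not monitor.observe(event):
        print(f"violation at event: {event}")
        break
```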
Abstract:
Cold smoking is one of the commonest methods of smoking fish. It uses smoke produced by burning hard and soft woods in smoking rooms. Smoke contains a number of chemical compounds, of which the most important are polycyclic aromatic hydrocarbons (PAHs). More than one hundred of these compounds have been identified in smoke; they are produced from saturated hydrocarbons formed by the breakdown of the wood's lignocellulose at high temperature under oxygen-poor conditions. The high toxic potential and carcinogenicity of sixteen of these compounds have been demonstrated in humans. In this research, PAH compounds were identified and monitored over a three-month storage period after smoking in three types of smoked fish: silver carp, Caspian Sea sefid, and herring, the most widely produced and consumed smoked fish in Iran. To find the relationship between PAH concentrations and lipid content, the lipid content was first determined separately in the skin and flesh of 30 samples of each species, using the method of Bligh and Dyer (1959). PAH compounds were extracted from all skin and flesh samples of the smoked fish using organic solvents in a Soxhlet apparatus, and the extracts were injected into a gas chromatograph (GC) with Hamilton syringes for qualitative and quantitative determination. The column used was 25 m long and 0.32 mm in diameter, with a silica stationary phase, nitrogen as the carrier gas, and a flame ionization detector (FID) suitable for these compounds. Statistical tests, run with computer software, showed that the differences in lipid content between the flesh and skin of each species, and among the species, were significant. The largest amounts were in herring: 18.74% in skin and 14.47% in flesh; the smallest were in sefid: 4.19% in skin and 3.10% in flesh. In silver carp the values were 13.28% in skin and 8.16% in flesh. Examination of the PAH compounds in the smoked fish showed that these carcinogenic compounds exist in all three species, in different quantities in each. The amounts appear to be directly related to lipid content and differ between flesh and skin; among the most important reasons are direct contact with the smoke and the lipid concentration of the tissues in all three species. Storage of the smoked fish for three months showed that most PAH compounds degraded and their concentrations decreased; the change in concentration over time differed by species and between flesh and skin. Human intake of these compounds through consumption of smoked fish depends on the resulting concentrations and on the manner and amount of consumption, so standards for maximum daily and monthly doses can now be determined and enforced, taking the relevant factors into account.
Abstract:
The stirring of a body of viscous fluid using multiple stirring rods is known to be particularly effective when the rods trace out a path corresponding to a nontrivial mathematical braid. The optimal braid is the so-called "pigtail braid", in which three stirring rods execute the usual "over-under" motion associated with braiding (plaiting) hair. We show how to achieve this optimal braiding motion straightforwardly: one stirring rod is driven in a figure-of-eight motion, while the other two rods are baffles, which rotate episodically about their common centre. We also explore the extent to which the physical baffles may be replaced by flow structures (such as periodic islands).
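As an illustration of this driving protocol, the figure-of-eight can be written as a 1:2 Lissajous curve while the baffles alternate between rotating half a turn and pausing; the parameterization below is a hypothetical sketch, not the exact trajectories used in the paper.

```python
# Hypothetical sketch of the rod trajectories: one driven rod on a
# figure-of-eight (1:2 Lissajous curve) and two baffles that rotate
# half a turn in alternate episodes, then hold. Amplitudes, radii,
# and episode lengths are illustrative.
import math

def driven_rod(t):
    """Figure-of-eight path for the driven stirring rod."""
    return (math.sin(t), 0.5 * math.sin(2 * t))

def baffle_angle(t):
    """Rotate by pi during even episodes of length pi, hold during odd ones."""
    episode, phase = divmod(t, math.pi)
    e = int(episode)
    if e % 2 == 0:                        # rotating episode
        return (e // 2) * math.pi + phase
    return ((e + 1) // 2) * math.pi       # holding episode

def baffles(t, radius=0.4):
    """Two baffles diametrically opposed about their common centre."""
    a = baffle_angle(t)
    return ((radius * math.cos(a), radius * math.sin(a)),
            (-radius * math.cos(a), -radius * math.sin(a)))

for step in range(5):
    t = step * math.pi / 2
    print(driven_rod(t), baffles(t))
```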
Abstract:
Database schemas, in many organizations, are considered one of the critical assets to be protected. From database schemas, it is possible to infer not only the information being collected but also the way organizations manage their businesses and/or activities. One of the ways to disclose database schemas is through Create, Read, Update and Delete (CRUD) expressions. In fact, their use can either follow strict security rules or be abused by malicious users. In the first case, users are required to master database schemas. This can be critical when applications that access the database directly, which we call database interface applications (DIAs), are developed by third-party organizations via outsourcing. In the second case, users can partially or totally disclose database schemas by following malicious algorithms based on CRUD expressions. To overcome this vulnerability, we propose a new technique in which CRUD expressions can no longer be directly manipulated by DIAs. Whenever a DIA starts up, the associated database server generates a random codified token for each CRUD expression and sends it to the DIA; the DIA then uses the token, rather than the expression itself, to ask the database server to execute the corresponding CRUD expression. In order to validate our proposal, we present a conceptual architectural model and a proof of concept.
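A minimal sketch of this token indirection follows; all names and session details are hypothetical, intended only to illustrate the idea that the DIA holds opaque random tokens rather than CRUD expressions.

```python
# Minimal sketch of the token indirection, with hypothetical names:
# at DIA start-up the server maps a fresh random token to each registered
# CRUD expression; the client later submits tokens, never SQL text.
import secrets

class DatabaseServer:
    def __init__(self, crud_expressions):
        # e.g. {"get_user": "SELECT name FROM users WHERE id = ?"}
        self._registered = crud_expressions
        self._tokens = {}

    def start_session(self):
        """Generate fresh random tokens for this DIA session."""
        self._tokens = {secrets.token_hex(16): sql
                        for sql in self._registered.values()}
        return list(self._tokens)          # only tokens leave the server

    def execute(self, token, params):
        sql = self._tokens.get(token)
        if sql is None:
            raise PermissionError("unknown CRUD token")
        return f"executing: {sql} with {params}"   # stand-in for a real DB call

server = DatabaseServer({"get_user": "SELECT name FROM users WHERE id = ?"})
tokens = server.start_session()
print(server.execute(tokens[0], (42,)))
```

Because only opaque tokens cross the boundary, a DIA cannot inspect or manipulate CRUD text to probe the schema.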