Abstract:
Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in such a way that the participants see only the final output, not each other's data. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move repeatedly between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker -- the game will be divided into rounds of local decision-making (e.g., bidding) and joint interaction (e.g., dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications or lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of these frameworks provides a verified toolchain to run MPC programs, leaving open the potential for security holes that can compromise the privacy of the parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC domain-specific language, called Wysteria, for writing rich mixed-mode MPC applications.
Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs -- as far as we know, Wys* is the first language to provide verification capabilities for MPC programs; (b) it provides a partially verified toolchain to run MPC programs; and (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, making Wys* more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs while providing privacy guarantees similar to those of the monolithic versions.
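The high-level secret-share abstractions mentioned above can be illustrated with a minimal additive secret-sharing sketch in Python. This is a generic illustration of the technique, not Wysteria's or Wys*'s actual implementation, and all names in it are hypothetical:

```python
import random

PRIME = 2**61 - 1  # arithmetic is done modulo this Mersenne prime

def share(secret, n_parties):
    """Split `secret` into n_parties additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Combine all shares to recover the shared value."""
    return sum(shares) % PRIME

# Three parties with private inputs; each input is split into shares.
alice, bob, carol = 5, 11, 2
all_shares = [share(x, 3) for x in (alice, bob, carol)]
# Party i holds one share of every input and adds its column locally,
# so the parties can jointly reconstruct only the sum of the inputs.
sum_shares = [sum(col) % PRIME for col in zip(*all_shares)]
total = reconstruct(sum_shares)  # 5 + 11 + 2 = 18
```

Additive shares like these underlie many MPC protocols; a language such as Wysteria exposes them through typed abstractions instead of manual arithmetic.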
Abstract:
Research on the socio-political dimensions of language diversity in mathematics classrooms is under-theorised and largely focuses on language choice. These dimensions are, however, likely to influence mathematics classroom interaction in many ways other than participants’ choice of language. To investigate these influences, I propose that the notions of heteroglossia, orders of indexicality, and scale-jumping can provide new theoretical tools with which to understand the links between classroom interaction and broader social patterns of marginalisation. To illustrate the utility of these ideas, I include some analysis of an episode observed in a sheltered elementary school second language mathematics classroom in Canada.
Abstract:
Drilling oil wells requires additional care, mainly when drilling offshore in ultra-deep water, where low overburden pressure gradients cause low fracture gradients and, consequently, hinder drilling by narrowing the operational window. To minimize, in the well planning phases, the difficulties faced while drilling in these scenarios, indirect models are used to estimate the fracture gradient, predicting approximate values for leak-off tests. These models generate geopressure curves that allow detailed analysis of the pressure behavior along the whole well. Most of these models are based on the Terzaghi equation, differing only in how the values of the rock tension coefficient are determined. This work proposes an alternative method for predicting the fracture pressure gradient, based on a geometric correlation that relates the pressure gradients proportionally at a given depth and extrapolates this relation to the whole well depth, meaning that these parameters vary in a fixed proportion. The model is based on the application of analytical proportion segments corresponding to the differential pressure related to the rock tension. The study shows that the proposed analytical proportion segments reach fracture gradient values in good agreement with the leak-off tests available for the field area. The results were compared with twelve different indirect models for fracture pressure gradient prediction based on the compaction effect; for this, software was developed in the Matlab language. The comparison was also made varying the water depth from zero (onshore wellbores) to 1500 meters. The leak-off tests are also used to compare the different methods, including the one proposed in this work. The proposed method gives good results in the error analysis compared to the other methods and, due to its simplicity, justifies its possible application.
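As a concrete illustration of the Terzaghi-based indirect models mentioned above, the classic effective-stress form (used, with different coefficients, by correlations such as Hubbert-Willis and Eaton) can be sketched in Python; the gradient values below are hypothetical, not field data:

```python
def fracture_gradient(overburden_grad, pore_grad, k):
    """Terzaghi-type effective-stress relation used by many indirect
    models: the fracture gradient sits between the pore pressure and
    overburden gradients, weighted by the rock tension coefficient k."""
    return k * (overburden_grad - pore_grad) + pore_grad

# Hypothetical gradients in psi/ft (illustrative only):
gf = fracture_gradient(overburden_grad=0.95, pore_grad=0.465, k=0.5)
# With k = 0.5, gf falls midway between the pore and overburden gradients.
```

The indirect models compared in the work differ essentially in how k is estimated, which is what the proposed proportion-segment method sidesteps.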
Abstract:
The present work aims to allow developers to implement small features for a given Android application quickly and easily, and to let users install them on demand, i.e., install only the ones they are interested in. These small feature packages are called plugins, and JavaScript was chosen as the language to develop them in. To achieve this, an Android framework was developed that enables the host application to install, manage, and run these plugins at runtime. The framework was designed to have a very clean and almost readable API, which allowed for better code organization and maintainability. The implementation used Google’s V8 engine to interpret the JavaScript code and, through a set of JNI calls, made that code invoke Android methods previously registered in the runtime. To test the framework, it was integrated with the client’s communication application, RCS+, using two plugins developed alongside the framework. Although these plugins covered only the more common requirements, they were shown to work as intended. In conclusion, although the framework was successful, it made clear that this kind of development through a non-native API has its own set of difficulties, especially regarding the implementation of complex features.
Abstract:
Knowledge organization (KO) research is a field of scholarship concerned with the design, study and critique of the processes of organizing and representing documents that societies see as worthy of preserving (Tennis, 2008). In this context we are concerned with the relationship between language and action. On the one hand, we are concerned with what language can and does do for our knowledge organization systems (KOS). For example, how do the words NEGRO or INDIAN work in historical and contemporary indexing languages? In relation to this, we are also concerned with how we know about knowledge organization (KO) and its languages. On the other hand, we are concerned with how to act given this knowledge. That is, how do we carry out research, and how do we design, implement, and evaluate KO systems? It is important to consider these questions in the context of our work because we are delegated by society to disseminate cultural memory. We are endowed with a perspective, prepared by an education, and granted positions whereby society asks us to ensure that documentary material is accessible to future generations. There is a social value in our work, and as such there is a social imperative to our work. We must act with good conscience, and use language judiciously, for the memory of the world is a heavy burden. In this paper, I explore these two weights of language and action that bear down on KO researchers. I first summarize what the extant literature says about the knowledge claims we make with regard to KO practices and systems.
To make clear what it is that I think we know, I create a schematic that links claims (language) to actions in advising, implementing, or evaluating information practices and systems. I will then contrast this with what we do not know, that is, what the unanswered questions might be (Gnoli, 2008; Dahlberg, 2011), and I will discuss them in relation to the two weights in our field of KO. Further, I will try to provide a systematic overview of possible ways to address these open questions in KO research. I will draw on the concept of elenchus – the forms of epistemology, theory, and methodology in KO (Tennis, 2008) – and on framework analysis, which concerns the structures, work practices, and discourses of KO systems (Tennis, 2006). In so doing, I will argue for a Neopragmatic stance on the weight of language and action in KO (Rorty, 1982; 2000). I will close by addressing the lacuna left in Neopragmatic thought – the ethical imperative to use language and action in a particular good and moral way. That is, I will address the ethical imperative of KO given its weights, epistemologies, theories, and methods. To do this, I will review a sample of relevant work on deontology in both Western and Eastern philosophical schools (e.g., Harvey, 1995). The perspective I want to communicate in this section is that the good in carrying out KO research may begin with epistemic stances (cf. language), but ultimately stands on ethical actions. I will present an analysis describing the micro and macro ethical concerns in relation to KO research and its advice on practice. I hope this demonstrates that the direction of epistemology, theory, and methodology in KO, while burdened with the dual weights of language and action, is clear when provided an ethical sounding board. We know how to proceed when we understand how our work can benefit the world. KO is an important, if not always understood, division of labor in a society that values its documentary heritage and memory institutions.
Being able to do good requires us to understand how to balance the weights of language and action. We must understand where we stand and be able to chart a path forward, one that does not cause harm, but adds value to the world and those that want to access recorded knowledge.
Abstract:
This thesis is composed of a collection of works written in the period 2019-2022, whose aim is to find methodologies from Artificial Intelligence (AI) and Machine Learning to detect and classify patterns and rules in argumentative and legal texts. We call our approach “hybrid”, since we aimed at designing hybrid combinations of symbolic and sub-symbolic AI, involving both “top-down” structured knowledge and “bottom-up” data-driven knowledge. A first group of works is dedicated to the classification of argumentative patterns. Following the Waltonian model of argument and the related theory of Argumentation Schemes, these works focused on the detection of argumentative support and opposition, showing that argumentative evidence can be classified at fine-grained levels without resorting to highly engineered features. To show this, our methods involved not only traditional approaches such as TF-IDF, but also some novel methods based on Tree Kernel algorithms. After the encouraging results of this first phase, we explored some emerging methodologies promoted by actors like Google, which have deeply changed NLP since 2018-19, i.e., Transfer Learning and language models. These new methodologies markedly improved our previous results, providing us with best-performing NLP tools. Using Transfer Learning, we also performed a Sequence Labelling task to recognize the exact span of argumentative components (i.e., claims and premises), thus connecting portions of natural language to portions of arguments (i.e., to the logical-inferential dimension). The last part of our work was dedicated to employing Transfer Learning methods for the detection of rules and deontic modalities. In this case, we explored a hybrid approach which combines structured knowledge coming from two LegalXML formats (i.e., Akoma Ntoso and LegalRuleML) with sub-symbolic knowledge coming from pre-trained (and then fine-tuned) neural architectures.
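The TF-IDF baseline mentioned above can be sketched in a few lines of pure Python; the two-document "corpus" is a hypothetical toy example, not data from the thesis:

```python
import math
from collections import Counter

def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents."""
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter(term for doc in docs for term in set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return weights

docs = [
    "the court upholds the claim".split(),
    "the claim lacks supporting premises".split(),
]
w = tf_idf(docs)
# "court" appears in only the first document, so it gets a positive
# weight there, while "claim" appears in both and gets log(2/2) = 0.
```

Vectors of such weights are what the traditional classifiers in the first phase of the thesis consumed, before the move to Transfer Learning.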
Abstract:
The article presents two translation models that were typical of the seventeenth and eighteenth centuries, but that can also be seen as paradigmatic in the field of Translation Studies: 1) the rhetorical model, which defends the possibility of translation and emphasizes the necessity of adapting the original to the taste of the target public; 2) the model that affirms the impossibility of translation, i.e., non-translatability: a) because the sensual elements linked to the language of the original are praised; b) because there is a defense of cultural relativism and of non-translatability between cultures; c) or because there is a defense not only of the impossibility of separating signifiers and meaning, but also a definition of the signifier, and of all identities, as the result of a differential game.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
An updated flow pattern map was developed for CO2 on the basis of the previous Cheng-Ribatski-Wojtan-Thome CO2 flow pattern map [1,2] to extend it to a wider range of conditions. A new annular flow to dryout transition (A-D) and a new dryout to mist flow transition (D-M) were proposed here. In addition, a bubbly flow region, which generally occurs at high mass velocities and low vapor qualities, was added to the updated flow pattern map. The updated flow pattern map is applicable to a much wider range of conditions: tube diameters from 0.6 to 10 mm, mass velocities from 50 to 1500 kg/m² s, heat fluxes from 1.8 to 46 kW/m² and saturation temperatures from -28 to +25 degrees C (reduced pressures from 0.21 to 0.87). The updated flow pattern map was compared to independent experimental data of flow patterns for CO2 in the literature and it predicts the flow patterns well. Then, a database of CO2 two-phase flow pressure drop results from the literature was set up and compared to the leading empirical pressure drop models: the correlations by Chisholm [3], Friedel [4], Gronnerud [5] and Muller-Steinhagen and Heck [6], a modified Chisholm correlation by Yoon et al. [7] and the flow pattern based model of Moreno Quiben and Thome [8-10]. None of these models was able to predict the CO2 pressure drop data well. Therefore, a new flow pattern based phenomenological model of two-phase flow frictional pressure drop for CO2 was developed by modifying the model of Moreno Quiben and Thome using the updated flow pattern map of this study, and it predicts the CO2 pressure drop database quite well overall. (C) 2007 Elsevier Ltd. All rights reserved.
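The quoted reduced-pressure range can be checked directly: reduced pressure is the saturation pressure divided by the critical pressure, which for CO2 is about 7.38 MPa. A small Python sketch, using approximate saturation pressures taken from standard CO2 property tables:

```python
P_CRIT_CO2 = 7.38  # MPa, critical pressure of CO2 (approximate)

# Approximate saturation pressures from standard CO2 property tables,
# at the two ends of the temperature range quoted for the map.
sat_pressure = {  # temperature (deg C) -> P_sat (MPa)
    -28: 1.55,
    25: 6.43,
}

reduced = {t: p / P_CRIT_CO2 for t, p in sat_pressure.items()}
# At -28 deg C the reduced pressure is about 0.21, and at 25 deg C
# about 0.87, matching the range stated for the updated map.
```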
Abstract:
Corresponding to the updated flow pattern map presented in Part I of this study, an updated general flow pattern based flow boiling heat transfer model was developed for CO2 using the Cheng-Ribatski-Wojtan-Thome [L. Cheng, G. Ribatski, L. Wojtan, J.R. Thome, New flow boiling heat transfer model and flow pattern map for carbon dioxide evaporating inside horizontal tubes, Int. J. Heat Mass Transfer 49 (2006) 4082-4094; L. Cheng, G. Ribatski, L. Wojtan, J.R. Thome, Erratum to: "New flow boiling heat transfer model and flow pattern map for carbon dioxide evaporating inside tubes" [Heat Mass Transfer 49 (21-22) (2006) 4082-4094], Int. J. Heat Mass Transfer 50 (2007) 391] flow boiling heat transfer model as the starting basis. The flow boiling heat transfer correlation in the dryout region was updated. In addition, a new mist flow heat transfer correlation for CO2 was developed based on the CO2 data, and a heat transfer method for bubbly flow was proposed for completeness' sake. The updated general flow boiling heat transfer model for CO2 covers all flow regimes and is applicable to a wider range of conditions for horizontal tubes: tube diameters from 0.6 to 10 mm, mass velocities from 50 to 1500 kg/m² s, heat fluxes from 1.8 to 46 kW/m² and saturation temperatures from -28 to 25 degrees C (reduced pressures from 0.21 to 0.87). The updated general flow boiling heat transfer model was compared to a new experimental database containing 1124 data points (790 more than in the previous model [Cheng et al., 2006, 2007]). Good agreement between the predicted and experimental data was found in general, with 71.4% of the entire database, and 83.2% of the database without the dryout and mist flow data, predicted within ±30%.
However, the predictions for the dryout and mist flow regions were less satisfactory due to the limited number of data points, the higher inaccuracy in such data, scatter in some data sets ranging up to 40%, significant discrepancies from one experimental study to another, and the difficulties associated with predicting the inception and completion of dryout around the perimeter of the horizontal tubes. (C) 2007 Elsevier Ltd. All rights reserved.
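The agreement statistic used above, the fraction of the database predicted within a ±30% relative error band, can be sketched as follows; the data points are hypothetical, not the CO2 database itself:

```python
def fraction_within(predicted, measured, tol=0.30):
    """Fraction of points whose relative prediction error is within tol."""
    hits = sum(abs(p - m) / m <= tol for p, m in zip(predicted, measured))
    return hits / len(measured)

# Hypothetical predicted vs. measured heat transfer coefficients:
measured = [10.0, 12.0, 8.0, 15.0]
predicted = [11.0, 9.0, 12.0, 14.0]
score = fraction_within(predicted, measured)  # 3 of 4 points within 30%
```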
Abstract:
The objective of this study was to verify the possible inclusion of the Salmonella/microsome mutagenicity assay in a groundwater monitoring program as a complementary assay to assess water quality. Groundwater samples from seven wells belonging to different types of aquifers were analyzed. Three different methods of sample preparation were used: membrane filtration, liquid-liquid extraction, and XAD-4 extraction. The filtered samples were tested using strains TA98, TA100, YG1041 and YG1042, and the water extracts only with TA98 and TA100. No mutagenic activity was observed in any of the 16 filtered samples tested. Of the 10 samples analyzed using XAD-4 extraction, five showed mutagenic activity, with potency ranging from 130 to 1500 revertants/L. Concerning the liquid-liquid extraction, three of the 11 samples analyzed showed mutagenicity. XAD-4 extraction proved the most suitable sample preparation method, and TA98 without S9 was found to be the most sensitive testing condition. The wells presenting water samples with mutagenic activity belonged to unconfined aquifers, which are more vulnerable to contamination. The data suggest that the Salmonella/microsome assay can be used as an efficient screening tool to monitor groundwater for mutagenic activity. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Group criticisms judged to be reasonable in the mouth of an ingroup member are aggressively rejected when they stem from an outgroup member (the intergroup sensitivity effect). Mediational analyses suggest that this phenomenon is underpinned by an attributional bias: criticisms from insiders are more likely than criticisms from outsiders to be perceived as motivated by constructive reasons, and thus arouse lower levels of defensiveness. But what if group members were to receive information that called into question the ingroup critic's commitment to the group? For example, if the ingroup critic was known to be a low identifier with their group, or used language suggesting that they were psychologically distancing themselves from their group, we might expect ingroup critics to be downgraded as strongly as outgroup critics. Furthermore, it might be possible for people to turn an outgroup criticism into an ingroup criticism by making their shared identity salient at the superordinate level. Three experiments are described that provide support for each of these propositions.
Abstract:
Liver transplantation increased 1.84-fold from 1988 to 2004. However, the number of patients on the waiting list for a liver increased 2.71-fold, from 553 to 1500. We used a mathematical equation to analyze the potential effect of using ABO-compatible living-donor liver transplantation (LDLT) on both our liver transplantation program and the waiting list. We calculated the prevalence distribution of blood groups (O, A, B, and AB) in the population and the probability of having a compatible parent or sibling for LDLT. The incidence of ABO compatibility in the overall population was as follows: A, 0.31; B, 0.133; O, 0.512; and AB, 0.04. The ABO compatibility for parent donors was blood group A, 0.174; B, 0.06; O, 0.152; and AB, 0.03; and for sibling donors was A, 0.121; B, 0.05; O, 0.354; and AB, 0.03. Use of LDLT can reduce the pressure on our liver transplantation waiting list by decreasing its size by at least 16.5% at 20 years after its introduction. Such a program could save an estimated 3600 lives over the same period.
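Taking the reported overall figures as blood group prevalences, the probability that a random donor is ABO-compatible with a given recipient can be computed directly. This is an illustrative calculation of standard ABO compatibility, not the authors' exact parent/sibling model:

```python
# Blood group prevalences, as reported in the abstract.
prevalence = {"O": 0.512, "A": 0.31, "B": 0.133, "AB": 0.04}

# Standard ABO rules: group O donates to anyone; AB receives from anyone.
compatible_donors = {
    "O": {"O"},
    "A": {"O", "A"},
    "B": {"O", "B"},
    "AB": {"O", "A", "B", "AB"},
}

def p_compatible_donor(recipient):
    """Probability that a random donor is ABO-compatible with `recipient`."""
    return sum(prevalence[g] for g in compatible_donors[recipient])

# e.g. a group A recipient can receive from O or A donors:
p_a = p_compatible_donor("A")  # 0.512 + 0.31 = 0.822
```

Restricting the donor pool to parents or siblings, as the study does, changes these probabilities because relatives' blood groups are correlated with the recipient's.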
Abstract:
Liver transplantation was first performed at the University of Sao Paulo School of Medicine in 1968. Since then, the patient waiting list for liver transplantation has grown at a rate of 150 new cases per month. Liver transplantation itself rose 1.84-fold (from 160 to 295) from 1988 to 2004. However, the number of patients on the liver waiting list jumped 2.71-fold (from 553 to 1500), and consequently the number of deaths on the liver waiting list rose 2.09-fold, from 321 to 671. We applied a mathematical model to analyze the potential impact of a donation after cardiac death (DCD) policy on our liver transplantation program and on the waiting list. Five thousand one hundred people died because of accidents and other violent causes in our state in 2004; of these, only 295 were donors of liver grafts that were transplanted. The model assumed that 5% of these grafts would have been DCD. We found a relative reduction of 27% in the size of the liver transplantation waiting list if DCD had been used, assuming that 248 additional liver transplants would have been performed annually. In conclusion, the use of DCD in our transplantation program would reduce the pressure on our liver transplantation waiting list, reducing it by at least 27%. On the basis of this model, the projected number of averted deaths is about 41,487 in the next 20 years. Liver Transpl 14:1732-1736, 2008. (C) 2008 AASLD.
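The fold increases quoted above can be reproduced directly from the figures given in the abstract, as a quick arithmetic check:

```python
# Figures quoted in the abstract (1988 -> 2004):
transplants = (160, 295)
waiting_list = (553, 1500)
deaths = (321, 671)

def fold_increase(pair):
    """Ratio of the later figure to the earlier one, rounded to 2 places."""
    before, after = pair
    return round(after / before, 2)

folds = [fold_increase(p) for p in (transplants, waiting_list, deaths)]
# folds matches the reported 1.84-, 2.71- and 2.09-fold increases.
```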