898 results for Theories and Models


Relevance:

90.00%

Publisher:

Abstract:

The availability of a huge amount of source code from code archives and open-source projects opens up the possibility of merging the machine learning, programming languages, and software engineering research fields. This area is often referred to as Big Code: programming languages are treated much as natural languages are, and different features and patterns of code can be exploited to perform many useful tasks and build supportive tools. Among all the possible applications that can be developed within the area of Big Code, the work presented in this research thesis focuses mainly on two tasks: Programming Language Identification (PLI) and Software Defect Prediction (SDP) for source code. Programming language identification is commonly needed in program comprehension and is usually performed directly by developers. However, at large scale, such as in widely used archives (GitHub, Software Heritage), automation of this task is desirable. To this end, the problem is analyzed from different points of view (text- and image-based learning approaches) and different models are created, paying particular attention to their scalability. Software defect prediction is a fundamental step in software development for improving quality and assuring the reliability of software products. In the past, defects were searched for by manual inspection or using automatic static and dynamic analyzers; now, this task can be automated using learning approaches that speed up and improve the related procedures. Here, two models have been built and analyzed to detect some of the most common bugs and errors at different code granularity levels (file and method levels). The data used and the models' architectures are analyzed and described in detail. Quantitative and qualitative results are reported for both the PLI and SDP tasks, and differences and similarities with respect to related work are discussed.
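As a purely illustrative aside (not the thesis's models or data), text-based PLI can be framed as supervised classification over character n-grams of source snippets; every snippet, label, and parameter below is a toy assumption.

```python
# Minimal sketch of text-based programming language identification (PLI):
# a character n-gram TF-IDF representation fed to a linear classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training snippets and labels (a real corpus would come from archives
# such as GitHub or Software Heritage).
snippets = [
    "def add(a, b):\n    return a + b",
    "public static int add(int a, int b) { return a + b; }",
    "fn add(a: i32, b: i32) -> i32 { a + b }",
    "function add(a, b) { return a + b; }",
]
labels = ["Python", "Java", "Rust", "JavaScript"]

# Character n-grams capture surface cues (keywords, punctuation, sigils)
# that help discriminate languages even in short fragments.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(1, 3)),
    LogisticRegression(max_iter=1000),
)
model.fit(snippets, labels)

print(model.predict(["let x: i32 = 3; println!(\"{}\", x);"]))
```

At scale, the pipeline shape stays the same, but the vectorizer and classifier would be replaced by models chosen with scalability in mind, as the abstract notes.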

Relevance:

90.00%

Publisher:

Abstract:

This dissertation explores the entanglement between the visionary capacity of feminist theory to shape sustainable futures and the active contribution of feminist speculative fiction to the conceptual debate about the climate crisis. Over the last few years, increasing critical attention has been paid to ecofeminist perspectives on climate change, which see the patriarchal domination of nature, considered to go hand in hand with the oppression of women, as a core cause of the climate crisis. What remains to be thoroughly scrutinised is the linkage between ecofeminist theories and other ethical stances capable of countering colonising epistemologies of mastery and dominion over nature. This dissertation intervenes in the debate about the master narrative of the Anthropocene, and about the one-dimensional perspective that often characterises its literary representations, from a feminist perspective that also aims at decolonising the imagination; it looks at literary texts that consider the patriarchal domination of nature in its intersections with other injustices that play out within the Anthropocene, with a particular focus on race, colonialism, and capitalism. After an overview of the linkages between gender and climate change and between feminism and the environmental humanities, it introduces the genre of climate fiction, examining its main tropes. In an attempt to find alternatives to the mainstream narrative of the Anthropocene (namely to its gender-neutrality, colour-blindness, and anthropocentrism), it focuses on contemporary works of speculative fiction by four Anglophone women authors that specifically address the inequitable impacts of climate change experienced not only by women, but also by sexualised, racialised, and naturalised Others. These texts were chosen because of their particular engagement with the relationship between climate change, global capitalism, and an uncritical trust in techno-fixes on the one hand, and structural inequalities generated by patriarchy, racism, and intersecting systems of oppression on the other.

Relevance:

90.00%

Publisher:

Abstract:

The General Data Protection Regulation (GDPR) has been designed to help promote a view in favor of the interests of individuals instead of large corporations. However, there is a need for dedicated technologies that can help companies comply with the GDPR while enabling people to exercise their rights. We argue that such a dedicated solution must address two main issues: the need for more transparency towards individuals regarding the management of their personal information, and their often-hindered ability to access their personal data and make it interoperable, so that exercising one's rights becomes straightforward. We aim to provide a system that helps push personal data management towards the individual's control, i.e., a personal information management system (PIMS). By using distributed storage and decentralized computing networks to run online services, control over users' personal information can be shifted towards those directly concerned, i.e., the data subjects. The use of Distributed Ledger Technologies (DLTs) and Decentralized File Storage (DFS) as an implementation of decentralized systems is of paramount importance in this case. The structure of this dissertation follows an incremental approach to describing a set of decentralized systems and models that revolve around personal data and their subjects. Each chapter builds upon the previous one and discusses the technical implementation of a system and its relation to the corresponding regulations. We refer to the EU regulatory framework, including the GDPR, eIDAS, and the Data Governance Act, to derive the functional and non-functional drivers of our final system architecture. In our PIMS design, personal data are kept in a Personal Data Space (PDS), consisting of the subject's encrypted personal data stored in a DFS. On top of that, a network of authorization servers acts as a data intermediary, providing access to potential data recipients through smart contracts.
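As a purely illustrative sketch of the access flow described above (all class and variable names are hypothetical, and the cipher and stores are toy stand-ins rather than real DLT/DFS components):

```python
# Toy simulation of a Personal Data Space (PDS): encrypted data in a
# content-addressed store (standing in for a DFS) and an authorization
# "contract" mediating access on behalf of the data subject.
import hashlib, os

class ContentAddressedStore:
    """Toy stand-in for a Decentralized File Storage (DFS) node."""
    def __init__(self):
        self._blobs = {}
    def put(self, blob: bytes) -> str:
        cid = hashlib.sha256(blob).hexdigest()  # content identifier
        self._blobs[cid] = blob
        return cid
    def get(self, cid: str) -> bytes:
        return self._blobs[cid]

class AuthorizationContract:
    """Toy stand-in for a smart contract acting as data intermediary."""
    def __init__(self, owner: str):
        self.owner = owner
        self._grants = set()  # (recipient, cid) pairs
    def grant(self, caller: str, recipient: str, cid: str):
        assert caller == self.owner, "only the data subject can grant access"
        self._grants.add((recipient, cid))
    def is_authorized(self, recipient: str, cid: str) -> bool:
        return (recipient, cid) in self._grants

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Placeholder cipher for illustration only; a real PDS would use
    # authenticated encryption (e.g. AES-GCM).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

dfs = ContentAddressedStore()
contract = AuthorizationContract(owner="alice")

key = os.urandom(16)
cid = dfs.put(xor_cipher(b"alice's medical record", key))  # encrypted at rest
contract.grant("alice", "hospital", cid)                    # subject-controlled consent

if contract.is_authorized("hospital", cid):
    record = xor_cipher(dfs.get(cid), key)  # key distribution is out of scope here
    print(record.decode())
```

In a real deployment the grant registry would live in a smart contract on a DLT and the blobs in a DFS, with key distribution handled by the authorization servers; the sketch only conveys the division of responsibilities.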

Relevance:

90.00%

Publisher:

Abstract:

The term Artificial Intelligence has acquired a lot of baggage since its introduction, and in its current incarnation it is synonymous with Deep Learning (DL). The sudden availability of data and computing resources has opened the gates to myriad applications. Not all of them are created equal, though, and problems may arise, especially in fields not closely related to the tasks that concern the tech companies that spearheaded DL. The perspective of practitioners seems to be changing, however. Human-Centric AI emerged in the last few years as a new way of thinking about DL and AI applications from the ground up, with special attention to their relationship with humans. The goal is to design systems that can gracefully integrate into already established workflows, as in many real-world scenarios AI may not be good enough to completely replace humans; often such replacement is not even needed or desirable. Another important perspective comes from Andrew Ng, a DL pioneer, who recently started shifting the focus of development from "better models" towards better, and smaller, data; he called this approach Data-Centric AI. Without downplaying the importance of pushing the state of the art in DL, we must recognize that if the goal is to create a tool for humans to use, more raw performance may not align with more utility for the final user. A Human-Centric approach is compatible with a Data-Centric one, and we find that the two overlap nicely when human expertise is used as the driving force behind data quality. This thesis documents a series of case studies in which these approaches were employed, to different extents, to guide the design and implementation of intelligent systems; human expertise proved crucial in improving both datasets and models. The last chapter includes a slight deviation, with studies on the pandemic that still preserve the human- and data-centric perspective.

Relevance:

90.00%

Publisher:

Abstract:

The aim of this dissertation is to describe the methodologies required to design, operate, and validate the performance of ground stations dedicated to near and deep space tracking, as well as the models developed to process the acquired signals, from raw data to the output parameters of spacecraft orbit determination. This work is framed in the context of lunar and planetary exploration missions and addresses the challenges of receiving and processing radiometric data for radio science investigations and navigation purposes. These challenges include the design of an appropriate back-end to read, convert, and store the antenna voltages; the definition of appropriate methodologies for the pre-processing, calibration, and estimation of radiometric data to extract information on the spacecraft state; and the definition and integration of accurate models of the spacecraft dynamics to evaluate the quality of the recorded signals. Additionally, the experimental design of acquisition strategies for direct comparisons between ground stations is described and discussed. In particular, evaluating the differential performance between stations requires the design of a dedicated tracking campaign that maximizes the overlap of the datasets recorded at the receivers, making it possible to correlate the received signals and isolate the contribution of the ground segment to the noise in each single link. Finally, in support of the methodologies and models presented, results from the validation and design work performed on the Deep Space Network (DSN) affiliated nodes DSS-69 and DSS-17 are also reported.
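As a point of orientation only (a textbook first-order relation, not a formula quoted from the thesis), the kind of radiometric observable involved can be sketched as the two-way Doppler shift produced by the station-spacecraft range rate:

```latex
% Two-way Doppler observable, non-relativistic first order (illustrative):
% f_t is the transmitted carrier frequency, f_r the received one, \dot{\rho}
% the station--spacecraft range rate, and c the speed of light.
\[
  f_r \simeq f_t\!\left(1 - \frac{2\dot{\rho}}{c}\right),
  \qquad
  \Delta f = f_r - f_t \simeq -\frac{2\dot{\rho}}{c}\, f_t .
\]
```

Orbit determination then adjusts the parameters of the dynamical model so that the predicted range-rate history matches the calibrated Doppler residuals.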

Relevance:

90.00%

Publisher:

Abstract:

This dissertation investigates the relations between logic and theoretical computer science (TCS) in the probabilistic setting. It is motivated by two main considerations. On the one hand, since their appearance in the 1960s-1970s, probabilistic models have become increasingly pervasive in several fast-growing areas of computer science. On the other, the study and development of (deterministic) computational models has considerably benefitted from the mutual interchanges between logic and computer science; nevertheless, probabilistic computation has only been marginally touched by such fruitful interactions. The goal of this thesis is precisely to (start) bridging this gap, by developing logical systems corresponding to specific aspects of randomized computation and, thereby, generalizing standard achievements to the probabilistic realm. To do so, our key ingredient is the introduction of new, measure-sensitive quantifiers associated with quantitative interpretations. The dissertation is tripartite. In the first part, we focus on the relation between logic and counting complexity classes. We show that, thanks to our classical counting propositional logic, it is possible to generalize to counting classes the standard results by Cook and by Meyer and Stockmeyer linking propositional logic and the polynomial hierarchy: the validity problem for counting-quantified formulae captures the corresponding level in Wagner's hierarchy. In the second part, we consider programming language theory. Type systems for randomized λ-calculi, also guaranteeing various forms of termination properties, were introduced in the last decades, but these are not "logically oriented" and no Curry-Howard correspondence is known for them. Following intuitions coming from counting logics, we define the first probabilistic version of the correspondence. Finally, we consider the relationship between arithmetic and computation. We present a quantitative extension of the language of arithmetic able to formalize basic results from probability theory. This language is also our starting point for defining randomized bounded theories and, thus, for generalizing canonical results by Buss.
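To fix intuitions about the measure-sensitive quantifiers mentioned above, the following is an illustrative rendering (the notation is assumed for exposition and need not match the thesis): a counting quantifier imposes a threshold on the fraction of propositional valuations satisfying a formula.

```latex
% Illustrative semantics of a counting-quantified formula C^{q} A over
% n propositional variables: A must hold in at least a fraction q of the
% 2^n valuations (notation assumed here for exposition).
\[
  \models C^{q} A
  \quad\Longleftrightarrow\quad
  \frac{\bigl|\{\, v \in \{0,1\}^{n} \;:\; v \models A \,\}\bigr|}{2^{n}} \;\ge\; q .
\]
```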

Relevance:

90.00%

Publisher:

Abstract:

The notion of commodification is a fascinating one. It entails many facets, ranging from subjective debates on the desirability of commodification to in-depth economic analyses of objects of value and their corresponding markets. Commodity theory is therefore not defined by a single debate, but spans a plethora of different discussions. This thesis maps and situates those theories and debates and selects one specific strain to investigate further. It argues that commodity theory in its optima forma deals with the investigation into what sets commodities apart from non-commodities, and it proceeds to examine the many answers given to this question by scholars ranging from the mid-1800s to the late 2000s. Ultimately, commodification is defined as a process in which an object becomes an element of the total wealth of societies in which the capitalist mode of production prevails. In doing so, objects must meet observables, or indicia, of commodification provided by commodity theories. Problems arise when objects are clearly part of the total wealth of a society without meeting the established commodity indicia: they are part of that wealth without counting as commodities. This thesis examines this phenomenon in relation to the novel commodities of audiences and data, explaining how these non-commodities (according to classical theories) are nevertheless essential elements of industry. The thesis then takes a deep dive into commodity theory using John Searle's theory of the construction of social reality.

Relevance:

90.00%

Publisher:

Abstract:

The dissertation explores the relationship between projects for urban blocks and the discourses on the city between the late 1960s and the 1980s, with a particular focus on the blocks of the Internationale Bauausstellung (IBA) Berlin 1979-87. The main research questions center on whether and how the block changed in connection with the emerging ideas of the city during this period and whether these changes had, in turn, effects on the city as a whole. Thus far, despite extensive research on the theories and ideas of the city between the 1960s and 1980s, there is a lack of studies that interweave this research with insights into the block. To fill this gap, this dissertation examines how the block was thematized in the 1970s discourses on the city. It highlights projects for blocks designed between the late 1960s and the 1970s in various European cities, particularly West Berlin. It then focuses on the blocks of the IBA Berlin 1979-87, examining them through theory, history, and drawings. The study of the examples reveals three distinctive aspects common to all the blocks considered in the dissertation: the overcoming of small private plots, the individualization of the buildings, and the accessibility of the courtyards from public streets. These aspects reflect the changing understandings of the city and of urban spaces in the 1970s and 1980s, which resulted in new compositional logics of the block. When examined with critical distance, the blocks of the 1970s and 80s offer a lesson in architectural and urban composition that is still current. - The author has made every effort to contact the owners of the copyrights of the material in the dissertation and remains available to any rights holders whom it was not possible to reach, as well as for any omissions or inaccuracies in the citation of sources.

Relevance:

90.00%

Publisher:

Abstract:

As people spend a third of their lives at work and, in most cases, indoors, the work environment assumes crucial importance. The continuous and dynamic interaction between people and the working environment surrounding them produces physiological and psychological effects on operators. Recognizing the substantial impact of comfort and well-being on employee satisfaction and job performance, the literature underscores the need for industries to implement indoor environment control strategies to ensure long-term success and profitability. However, managing physical risks (i.e., ergonomic and microclimatic) in industrial environments is often constrained by production and energy requirements. In the food processing industry, for example, the safety of perishable products dictates storage temperatures that do not allow for operator comfort; conversely, warehouses dedicated to non-perishable products often lack cooling systems in order to limit energy expenditure, and reach high temperatures in summer. Moreover, exceptional events, such as the COVID-19 pandemic, introduce new constraints, with recommendations impacting thermal stress and respiratory health. Furthermore, the thesis highlights how workers' characteristics, particularly the aging process, reduce tolerance to environmental stresses; prolonged exposure to environmental stress at work can thus result in cardiovascular disease and musculoskeletal disorders. In response to the global trend of an aging workforce, the thesis bridges a literature gap by proposing methods and models that integrate the age factor into comfort assessment. It aims to present technical and technological solutions to mitigate microclimate risks in industrial environments, ultimately seeking innovative ways to enhance the aging workforce's comfort, performance, experience, and skills. The research outlines a logical-conceptual scheme with three main areas of focus: analyzing the factors influencing the work environment, recognizing the constraints on worker comfort, and designing solutions. The results contribute significantly to science by laying the foundation for new research on worker health and safety in today's industrial context of an aging working population.

Relevance:

90.00%

Publisher:

Abstract:

This PhD thesis focuses on the classical scattering of massive/massless particles off black holes, and on double copy relations between classical observables in gauge theories and gravity. This is done in the post-Minkowskian approximation, i.e. a perturbative expansion of observables controlled by the gravitational coupling constant κ, with κ² = 32πG_N and G_N the Newtonian coupling constant. The investigation is carried out using the Worldline Quantum Field Theory (WQFT), which combines a worldline path integral describing the scattering objects with a QFT path integral, in the Born approximation, describing the intermediate bosons exchanged by the massive/massless particles in the scattering event. We introduce the WQFT by deriving a relation between the Kosower-Maybee-O'Connell (KMOC) limit of amplitudes and worldline path integrals, and then use it to study the classical Compton amplitude and higher-point amplitudes. We also present an application of our formulation to the case of Hard Thermal Loops (HTL), by explicitly evaluating hard thermal currents in gauge theory and gravity. Next we move to the investigation of the classical double copy (CDC), a powerful tool to generate integrands for classical observables related to the binary inspiral problem in General Relativity. In order to use a Bern-Carrasco-Johansson (BCJ)-like prescription directly at the classical level, one has to identify a double copy (DC) kernel encoding the locality structure of the classical amplitude; such a kernel is evaluated using a theory in which scalar particles interact through bi-adjoint scalars. We show here how to push the classical double copy forward so as to account for spinning particles, in the framework of the WQFT, where the quantization procedure on the worldline allows us to fully reconstruct the quantum theory on the gravitational side. Finally, we investigate how to describe the scattering of massless particles off black holes in the WQFT.
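For orientation, the conventions implied above can be written out as follows (standard choices assumed here, not quoted from the thesis): the coupling κ enters through the graviton expansion of the metric, and a classical observable such as the impulse on one of the particles is expanded in powers of G_N.

```latex
% Gravitational coupling, metric expansion, and schematic post-Minkowskian
% (PM) expansion of the impulse on particle 1 (conventions assumed).
\[
  \kappa^{2} = 32\pi G_{N},
  \qquad
  g_{\mu\nu} = \eta_{\mu\nu} + \kappa\, h_{\mu\nu},
  \qquad
  \Delta p_{1}^{\mu} = \sum_{n \ge 1} G_{N}^{\,n}\, \Delta p_{1}^{(n)\mu} .
\]
```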

Relevance:

90.00%

Publisher:

Abstract:

In this thesis, I address quantum theories, and specifically quantum field theories, in their interpretive aspects, with the aim of capturing some of the most controversial and challenging issues, also in relation to possible future developments of physics. To do so, I rely on and review some of the discussions carried out in the philosophy of physics, highlighting their methodologies and goals; this makes the thesis an introduction to these discussions. Building on these arguments, I designed and conducted 7 face-to-face interviews with physics professors and an online survey (which received 88 responses from master's and PhD students and postdoctoral researchers in physics), with the aim of understanding how physicists make sense of concepts related to quantum theories and finding out what they can add to the discussion. For the data collected, I report a qualitative analysis organized around three constructed themes.

Relevance:

80.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

80.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

80.00%

Publisher:

Abstract:

Though introduced recently, research on complex networks has grown steadily because of its potential to represent, characterize, and model a wide range of intricate natural systems and phenomena. Because of the intrinsic complexity and systemic organization of life, complex networks provide an especially promising framework for systems biology investigation. The current article is an up-to-date review of the major developments related to the application of complex networks in biology, with special attention focused on the more recent literature. The main concepts and models of complex networks are presented and illustrated in an accessible fashion. Three main types of networks are covered: transcriptional regulatory networks, protein-protein interaction networks, and metabolic networks. The key role of complex networks in systems biology is extensively illustrated by several of the papers reviewed.
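As a purely illustrative sketch of how such networks are represented and probed computationally (the proteins and interactions below are made up, and networkx is just one convenient library choice):

```python
# Representing a (made-up) protein-protein interaction network as a graph
# and inspecting simple topological properties of the kind discussed in
# complex-network studies of biology.
import networkx as nx

# Hypothetical interactions; real PPI data would come from a curated database.
interactions = [
    ("TP53", "MDM2"), ("TP53", "BRCA1"), ("TP53", "ATM"),
    ("BRCA1", "BARD1"), ("BRCA1", "RAD51"), ("ATM", "CHEK2"),
]

ppi = nx.Graph()
ppi.add_edges_from(interactions)

# Hubs (highly connected proteins) are a hallmark of the scale-free-like
# organization often reported for biological networks.
degrees = dict(ppi.degree())
hub = max(degrees, key=degrees.get)
print(f"nodes={ppi.number_of_nodes()} edges={ppi.number_of_edges()}")
print(f"most connected protein: {hub} (degree {degrees[hub]})")
print("degree centrality:", nx.degree_centrality(ppi))
```

The same graph abstraction carries over to transcriptional regulatory and metabolic networks, with directed or bipartite variants where appropriate.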

Relevance:

80.00%

Publisher:

Abstract:

Several models of time estimation have been developed in psychology; a few have been applied to music. In the present study, we assess the influence of the distances travelled through pitch space on retrospective time estimation. Participants listened to an isochronous chord sequence of 20-s duration. They were unexpectedly asked to reproduce the time interval of the sequence. The harmonic structure of the stimulus was manipulated so that the sequence either remained in the same key (CC) or travelled through a closely related key (CFC) or distant key (CGbC). Estimated times were shortened when the sequence modulated to a very distant key. This finding is discussed in light of Lerdahl's Tonal Pitch Space Theory (2001), Firmino and Bueno's Expected Development Fraction Model (in press), and models of time estimation.