916 results for transmission of data and images
Abstract:
This master's thesis was written with the aim of exploring an inequality: an inequality in the practices surrounding the capture and exploitation of user data in the sphere of Web technologies and services, and more particularly in the sphere of GIS (Geographic Information Systems). In 2014, many companies exploit their users' data in order to improve their services or to generate advertising revenue. On the public and governmental side, this shift has not taken place, and federal and municipal governments are therefore deprived of data that could help improve public infrastructure and services. Cities around the world are trying to improve their services and become "smart", but they lack the resources and know-how to ensure a transition that respects the privacy and wishes of their residents. How can a city create geo-referenced datasets without infringing on the rights of its residents? To answer these questions, we carried out a comparative study of the use of OpenStreetMap (OSM) and Google Maps (GM). Through a series of interviews with GM and OSM users, we were able to understand the meanings and use values of these two platforms. An analysis drawing on the concepts of appropriation, collective action and various critical perspectives allowed us to examine our interview data in order to understand the issues and problems behind the use of geolocation technologies, as well as those related to users' contributions to these GIS. Following this analysis, our understanding of the contribution to and use of these services was recontextualised to explore the potential means available to cities to use geolocation technologies to improve their public infrastructure while respecting their citizens.
Abstract:
Learning Disability (LD) is a classification covering several disorders in which a child has difficulty learning in a typical manner, usually caused by an unknown factor or factors. LD affects about 15% of children enrolled in schools. Predicting learning disability is a complicated task, since identifying LD from diverse features or signs is itself a difficult problem; learning disabilities have no cure and are life-long. The problems of children with specific learning disabilities have been a cause of concern to parents and teachers for some time. The aim of this paper is to develop a new algorithm for imputing missing values and to determine the significance of the missing value imputation method and the dimensionality reduction method for the performance of fuzzy and neuro-fuzzy classifiers, with specific emphasis on the prediction of learning disabilities in school-age children. In the basic assessment method for prediction of LD, checklists are generally used, and the data cases thus collected depend heavily on the mood of the children and may also contain redundant as well as missing values. Therefore, in this study we propose a new correlation-based algorithm for imputing the missing values and Principal Component Analysis (PCA) for removing irrelevant attributes. The study found that the preprocessing methods applied improve the quality of the data and thereby increase the accuracy of the classifiers. The system is implemented in MathWorks MATLAB 7.10. The results obtained from this study illustrate that the developed missing value imputation method is a valuable contribution to the prediction system and is capable of improving the performance of a classifier.
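The abstract does not spell out the imputation algorithm itself, so the following is only a rough sketch, in Python, of the general idea of correlation-based imputation followed by PCA: each incomplete attribute is filled in from its most strongly correlated attribute, and the imputed data are then projected onto a smaller number of principal components before classification. The column names, the synthetic data and the choice of a simple linear fit are all hypothetical, not taken from the paper.

# Illustrative sketch only: fill each missing attribute from its most strongly
# correlated attribute, then reduce dimensionality with PCA. This reconstructs
# the general idea, not the paper's exact algorithm.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

def correlation_impute(df: pd.DataFrame) -> pd.DataFrame:
    """For each column with gaps, regress it on its most correlated other
    column and predict the missing entries from that linear fit."""
    out = df.copy()
    corr = df.corr().abs()
    for col in df.columns:
        if not df[col].isna().any():
            continue
        donor = corr[col].drop(col).idxmax()          # best correlated predictor
        pair = df[[col, donor]].dropna()
        slope, intercept = np.polyfit(pair[donor], pair[col], 1)
        missing = out[col].isna() & out[donor].notna()
        out.loc[missing, col] = intercept + slope * out.loc[missing, donor]
    # entries whose donor is also missing fall back to the column mean
    return out.fillna(out.mean())

# Hypothetical checklist data: rows are children, columns are checklist scores.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X = pd.DataFrame(base + 0.6 * rng.normal(size=(200, 10)),
                 columns=[f"item_{i}" for i in range(10)])
X = X.mask(rng.random(X.shape) < 0.1)                 # knock out ~10% of the cells

X_imputed = correlation_impute(X)
X_reduced = PCA(n_components=5).fit_transform(X_imputed)
print(X_reduced.shape)   # (200, 5): reduced features for a fuzzy / neuro-fuzzy classifier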
Abstract:
This study examines current and forthcoming measures related to the exchange of data and information in EU Justice and Home Affairs policies, with a focus on the 'smart borders' initiative. It argues that the growing reliance on such schemes shows no sign of being reversed, and asks whether the current and forthcoming proposals are necessary and original. It outlines the main challenges raised by the proposals, including issues related to the right to data protection, but also to privacy and non-discrimination.
Abstract:
The accuracy of medicine use information was compared for a telephone interview and a mail questionnaire, using an in-home medicine check as the standard of assessment. The validity of medicine use information varied by data source, level of specificity of the data, and respondent characteristics. The mail questionnaire was the more valid source of overall medicine use information. Implications for both service providers and researchers are provided.
Abstract:
The implications of the new research presented in Volume 2, Issue 1 (Human Trafficking) of the Journal of Applied Research on Children are explored, calling attention to the need for increased awareness, greater availability of data, and proactive policy solutions to combat child trafficking.
Abstract:
Corrosion of reinforcing steel in concrete due to chloride ingress is one of the main causes of the deterioration of reinforced concrete structures. The structures most affected by such corrosion are buildings in marine zones and structures exposed to de-icing salts, such as highways and bridges. The process is accompanied by an increase in the volume of the corrosion products at the rebar-concrete interface; depending on the level of oxidation, iron can expand to as much as six times its original volume. This increase in volume exerts tensile stresses in the surrounding concrete which result in cracking and spalling of the concrete cover if the concrete tensile strength is exceeded. The mechanism by which steel embedded in concrete corrodes in the presence of chloride is the local breakdown of the passive layer formed in the highly alkaline environment of the concrete. It is assumed that corrosion initiates when a critical chloride content reaches the rebar surface. The mathematical formulation idealizes the corrosion sequence as a two-stage process: an initiation stage, during which chloride ions penetrate to the reinforcing steel surface and depassivate it, and a propagation stage, in which active corrosion takes place until cracking of the concrete cover occurs. The aim of this research is to develop computer tools to evaluate the duration of the service life of reinforced concrete structures, considering both the initiation and propagation periods. Such tools must offer a friendly interface to facilitate their use by researchers whose background is not in numerical simulation. For the evaluation of the initiation period, different tools have been developed:
• Program TavProbabilidade: provides the means to carry out a probability analysis of a chloride ingress model. Such a tool is necessary due to the lack of data and the general uncertainties associated with the phenomenon of chloride diffusion. It differs from the deterministic approach because it computes not just a chloride profile at a certain age, but a range of chloride profiles, each with its probability of occurrence.
• Program TavProbabilidade_Fiabilidade: carries out reliability analyses of the initiation period. It takes into account the critical value of the chloride concentration at the steel that causes breakdown of the passive layer and the beginning of the propagation stage. It differs from the deterministic analysis in that it does not predict whether corrosion will begin, but quantifies the probability of corrosion initiation.
• Program TavDif_1D: created to perform a one-dimensional deterministic analysis of the chloride diffusion process by the finite element method (FEM), which numerically solves Fick's second law. Despite the FEM solvers already available in one dimension, a new code (TavDif_1D) was written because of the need for a solver with a friendly interface for pre- and post-processing, according to the needs of IETCC. An additional tool was also developed, with a systematic method devised to compare the ability of different 1D models to predict the actual evolution of chloride ingress based on experimental measurements, and to quantify the degree of agreement of the models with each other.
For the evaluation of the entire service life of the structure, a computer program has been developed using the finite element method to couple the two service life periods, initiation and propagation. The 2D program (TavDif_2D) allows the complementary use of two external programs through a single friendly interface:
• GMSH – a finite element mesh generator and post-processing viewer
• OOFEM – a finite element solver
TavDif_2D is responsible for deciding, at each time step, when and where to start applying the boundary conditions of the fracture mechanics module as a function of the chloride concentration and the corrosion parameters (Icorr, etc.). It is also responsible for verifying the presence and degree of fracture in each element, in order to pass on the variation of the diffusion coefficient with the crack width. The advantages of the FEM with the interface provided by the tool are:
• the flexibility to input data such as material properties and boundary conditions as time-dependent functions;
• the flexibility to predict the chloride concentration profile for different geometries;
• the possibility of coupling chloride diffusion (initiation stage) with chemical and mechanical behavior (propagation stage).
The OOFEM code had to be modified to accept temperature, humidity and time-dependent values for the material properties, which is necessary to adequately describe the environmental variations. A 3-D simulation was performed to simulate the behavior of a beam under both the external load and the internal load caused by the corrosion products, using embedded-fracture elements, in order to plot the curve of the deflection of the central region of the beam versus the external load and compare it with the experimental data.
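As a point of reference for the initiation-period tools described above, the snippet below is a minimal sketch (not the TavProbabilidade or TavDif codes themselves) of the classic closed-form solution of Fick's second law for a constant surface concentration, C(x,t) = Cs · erfc(x / (2·sqrt(D·t))), used to estimate when the chloride content at the rebar depth reaches the critical value. All parameter values are hypothetical, and the diffusion coefficient is taken as constant, an assumption the programs above do not make.

# Minimal sketch of the initiation-period idea: 1-D chloride ingress under
# Fick's second law with constant surface concentration. Parameter values
# below are hypothetical, for illustration only.
import numpy as np
from scipy.special import erfc

Cs    = 0.50     # surface chloride concentration (% by binder mass), assumed
Ccrit = 0.05     # critical chloride content at the rebar, assumed
D     = 1.0e-12  # apparent chloride diffusion coefficient (m^2/s), assumed constant
cover = 0.040    # concrete cover depth (m)

def chloride(x, t):
    """C(x, t) = Cs * erfc(x / (2 * sqrt(D * t))) at depth x (m), time t (s)."""
    return Cs * erfc(x / (2.0 * np.sqrt(D * t)))

# End of the initiation period: first year in which the chloride content at the
# rebar depth reaches the critical value and depassivation may begin.
years = np.arange(1, 201)
seconds = years * 365.25 * 24.0 * 3600.0
c_rebar = chloride(cover, seconds)
reached = c_rebar >= Ccrit
t_init = int(years[np.argmax(reached)]) if reached.any() else None
print("Estimated initiation period (years):", t_init)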
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Purpose – The purpose of this editorial is to stimulate debate and discussion amongst marketing scholars regarding the implications for scientific research of increasingly large amounts of data and increasingly sophisticated data analytic techniques. Design/methodology/approach – The authors respond to a recent editorial in WIRED magazine which heralds the demise of the scientific method in the face of the vast data sets now available. Findings – The authors propose that more data makes theory more important, not less. They differentiate between raw prediction and scientific knowledge, which is aimed at explanation. Research limitations/implications – These thoughts are preliminary and intended to spark thinking and debate, not to represent editorial policy. Due to space constraints, the coverage of many issues is necessarily brief. Practical implications – Marketing researchers should find these thoughts at the very least stimulating, and may wish to investigate these issues further. Originality/value – This piece should provide some interesting food for thought for marketing researchers.
Abstract:
The question of forming an aim-oriented description of the object domain of a decision support process is outlined. Two main problems of estimating and evaluating data and knowledge uncertainty in decision support systems, the direct and the inverse, are formulated. Three conditions that serve as formalized criteria for the aim-oriented construction of the input, internal and output spaces of a decision support system are proposed. Definitions of apparent and hidden data uncertainties on a measuring scale are given.
Abstract:
Владимир Димитров - The aim of this paper is the formal specification of the relational data model. This specification can subsequently be extended to the object-relational data model and to data streams.
Abstract:
Assertion is a speech act that stands at the intersection of the philosophy of language and social epistemology. It is a phenomenon that bears on such wide-ranging topics as testimony, truth, meaning, knowledge and trust. It is thus no surprise that analytic philosophers have devoted innumerable pages to assertion, trying to give the norms that govern it, its role in the transmission of knowledge, and most importantly, what assertion is, or how assertion is to be defined. In this thesis I attempt to show that all previous answers to the question “What is assertion?” are flawed. There are four major traditions in the literature: constitutive norm theories of assertion, accounts that treat assertion as the expression of speaker attitudes, accounts that treat assertion as a proposal to add some proposition to the common ground, and accounts that treat assertion as the taking of responsibility for some claim. Each tradition is explored here, the leading theories within the tradition developed, and then placed under scrutiny to demonstrate flaws within the positions surveyed. I follow the work of G.E. Moore and William P. Alston, whilst drawing on the work of Robert Brandom in order to give a new bipartite theory of assertion. I argue that assertion consists in the explicit presentation of a proposition, along with a taking of responsibility for that proposition. Taking Alston's explicit presentation condition and repairing it in order to deal with problems it faces, whilst combining it with Brandom's responsibility condition, provides, I believe, the best account of assertion.
Abstract:
This thesis focuses on the history of the inflexional subjunctive and its functional substitutes in Late Middle English. To explore why and how the inflexional subjunctive declined in the history of the English language, I analysed 2653 examples of three adverbial clauses introduced by if (1882 examples), though (305 examples) and lest (466 examples). Using a corpus-based approach, this thesis argues that linguistic change in subjunctive constructions did not happen suddenly but gradually, that the way it changed varied, and that different constructions changed at different speeds in different environments. It is well known that the inflexional subjunctive declined in the history of English, mainly because of inflexional loss. Strangely, however, this topic has been comparatively neglected in the scholarly literature, especially with regard to the Middle English period, probably due to the limitations of the data and because study of this development requires very cumbersome textual research. This thesis has derived and analysed data from three large corpora in the public domain: the Middle English Grammar Corpus (MEG-C for short), the Innsbruck Computer Archive of Machine-Readable English Texts (ICAMET for short), and selected texts from The Corpus of Middle English Prose and Verse, part of the Middle English Compendium that also includes the Middle English Dictionary. The data were analysed from three perspectives: 1) clausal type, 2) dialect, and 3) textual genre. The basic methodology was to analyse the examples one by one, with special attention paid to the peculiarities of each text. In addition, this thesis draws on some complementary (indeed overlapping) linguistic theories for further discussion: 1) Biber's multi-dimensional theory, 2) Ogura and Wang's (1994) S-curve or 'diffusion' theory, 3) Kretzchmar's (2009) linguistics of speech, and 4) Halliday's (1987) notion of language as a dynamic open system. To summarise the outcomes of this thesis: 1) On variation between clausal types, it was shown that the distributional tendencies of verb types (subjunctive, indicative, modal) differ between the three adverbial clauses under consideration. 2) On variation between dialects, it was shown that the northern area, i.e. the so-called Great Scandinavian Belt, displays an especially high ratio of the inflexional subjunctive construction compared to the other areas. This thesis suggests that this result was caused by the influence of Norse, relating the finding to Samuels' (1989) argument that the present tense -es ending in the northern dialect was introduced through the influence of the Scandinavians. 3) On variation between genres, those labelled Science, Documents and Religion display a relatively high ratio of the inflexional subjunctive, while Letter, Romance and History show a relatively low ratio. These results are explained by Biber's multi-dimensional theory, which shows that the inflexional subjunctive can be related to the factors 'informational', 'non-narrative', 'persuasive' and 'abstract'. 4) Lastly, on the inflexional subjunctive in Late Middle English, this thesis concludes that 1) the change did not happen suddenly but gradually, and 2) the way language changes varies.
Thus the inflexional subjunctive did not disappear suddenly from England, and there was a time lag among the clausal types, dialects and genres, which can be related to Ogura and Wang's S-curve ("diffusion") theory and Kretzchmar's view of the "linguistic continuum". This thesis has shown that the issues surrounding the inflexional subjunctive are quite complex, so that research in this area requires not only textual analysis but also theoretical analysis, considering both intra- and extra-linguistic factors.
Abstract:
Affiliated with a literary context in which the poet seeks reconciliation between the self and the universe without rejecting consciousness of the poetic process and the renovation of language, the poems of Guimarães Rosa in his book Ave, Palavra are close to what Octavio Paz called "poetry of convergence". Such an analogical point of view becomes metaphor in the poems through the myth of Narcissus and through images that associate reflection and the encounter with the Other as a way of knowing oneself. This work presents a reading of these poetic compositions by examining how the analogy is established, relating the poems to his other works and identifying them not as an accident in his trajectory, but as a work that carries the concerns Guimarães Rosa explored throughout his literary career.
Abstract:
The concept of EDI (Electronic Data Interchange) is normally used to describe integration technologies between companies. Logistically, this implies the integration of enterprises in a supply chain through the electronic transmission of data, thus reducing human intervention in the process and favoring organizational performance. This study investigated the main benefits of EDI for organizational competitiveness and its possible impact on improving the logistics performance of four companies belonging to large national networks. Through multiple case studies, it was possible to identify common ways in which the use of EDI can benefit these companies. The results were significant and suggest that the use of this tool can add value to logistics, primarily through the streamlining of processes, inventory optimization, cost reduction and potential performance improvement. Thus, the use of Electronic Data Interchange as a strategic tool for logistics proved, through this study, to be an efficient alternative for business improvement and good practice, able to leverage competitive advantages not only for individual companies but for the entire supply chain.
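As a rough illustration of the kind of machine-readable exchange the study refers to, the sketch below parses a simplified X12-style EDI fragment in which '~' terminates a segment and '*' separates its elements. The segment content is invented for illustration only; real interchanges carry mandatory ISA/GS envelope segments and strict element definitions, and nothing here is drawn from the companies studied.

# Illustrative only: a toy parser for a simplified X12-style EDI fragment.
from typing import List

SAMPLE = (
    "ST*850*0001~"                   # transaction set header (850 = purchase order)
    "BEG*00*SA*PO12345**20240115~"   # original stand-alone PO, hypothetical number and date
    "PO1*1*48*EA*9.95**VP*SKU-001~"  # one line item: quantity 48, unit EA, unit price 9.95
    "CTT*1~"                         # transaction totals: one line item
    "SE*5*0001~"                     # trailer: 5 segments, matching control number
)

def parse_segments(raw: str) -> List[List[str]]:
    """Split an EDI string into segments and their elements."""
    return [segment.split("*") for segment in raw.strip("~").split("~")]

for tag, *elements in parse_segments(SAMPLE):
    print(f"{tag}: {elements}")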