911 results for architectural design -- data processing


Relevance: 100.00%

Abstract:

Products and services explicitly intended to influence users’ behaviour are increasingly being proposed to reduce environmental impact and for other areas of social benefit. Designing such interventions often involves adopting and adapting principles from other contexts where behaviour change has been studied. The ‘design pattern’ form, used in software engineering and HCI, and originally developed in architecture, offers benefits for this transposition process. This article introduces the Design with Intent toolkit, an idea generation method using a design pattern form to help designers address sustainable behaviour problems. The article also reports on exploratory workshops in which participants used the toolkit to generate concepts for redesigning everyday products—kettles, curtains, printers and bathroom sinks/taps—to reduce the environmental impact of use. The concepts are discussed, along with observations of how the toolkit was used by participants, suggesting usability improvements to incorporate in future versions.

Relevance: 100.00%

Abstract:

Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse-engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications on reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.

Relevance: 100.00%

Abstract:

PEDRINI, Aldomar; SZOKOLAY, Steven. Recomendações para o desenvolvimento de uma ferramenta de suporte às primeiras decisões projetuais visando ao desempenho energético de edificações de escritório em clima quente. Ambiente Construído, Porto Alegre, v. 5, n. 1, p. 39-54, Jan./Mar. 2005. Quarterly. Available at: . Accessed: 4 Oct. 2010.

Relevance: 100.00%

Abstract:

This research addresses social housing and its relation to thermal comfort, applied to an architectural and urban intervention on a site in the central urban area of Macaíba/RN, Brazil. One of its main goals is to reflect on the role of design and of alternative building materials in the search for better performance. The hypothesis is that, by changing design parameters and the choice of materials, it is possible to achieve better thermal performance. Computer simulations of thermal performance and of natural ventilation were therefore carried out, the latter using computational fluid dynamics (CFD). The presentation of the thermal simulation followed the methodology proposed in the dissertation of Negreiros (2010), which aims to find the percentage of comfort hours obtained throughout the year, while natural ventilation was analysed from images extracted from the CFD simulations. From the designed building model, an analytical framework was established that compares three different housing proposals, evaluating the thermal performance of the buildings together with the spatial design variables, construction materials and costs. It is concluded that the study confirmed the general hypotheses set at its start: it was possible to quantify the results and to show that the contributions of design and of construction materials are comparable and that, if combined, they lead to potential gains in thermal performance.
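The percentage-of-comfort-hours metric referred to above can be illustrated with a short sketch. The comfort band limits used here (18-29 °C) are assumptions for illustration only, not the criteria actually defined in Negreiros (2010):

```python
# Sketch of a percentage-of-comfort-hours metric over a year of simulated
# hourly indoor temperatures. The 18-29 degC comfort band is an assumed,
# illustrative range, not the one used in the dissertation.

def comfort_hours_percentage(hourly_temps_c, lower=18.0, upper=29.0):
    """Share of hours whose indoor temperature falls inside the
    comfort band, expressed as a percentage."""
    in_comfort = sum(1 for t in hourly_temps_c if lower <= t <= upper)
    return 100.0 * in_comfort / len(hourly_temps_c)

# Example: a year of hourly temperatures (8760 values), 75% in the band
temps = [20.0] * 6570 + [32.0] * 2190
print(comfort_hours_percentage(temps))  # 75.0
```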

Relevance: 100.00%

Abstract:

Internship report presented to the Escola Superior de Educação de Paula Frassinetti for the degree of Master in Pre-School Education and Teaching of the 1st Cycle of Basic Education.

Relevance: 100.00%

Abstract:

The objective of the present study is to investigate the role that the representations constructed by 1st grade students play in mathematical problem solving. More specifically, the research is oriented by the following questions: Which representations do students prefer to use when solving problems? In which way do the strategies adopted by the students in problem solving influence those representations? What is the role of the distinct types of representation in the problem solving process? This research assumes that problem solving is a very important activity in the learning of Mathematics in the first cycle of basic education. Problems should be varied, appeal to diverse resolution strategies, and allow students to construct distinct representations. Active, iconic and symbolic representations are important tools for students to organize, record and communicate their mathematical ideas, particularly in problem solving, and they also support the understanding of mathematical concepts and relationships. The adopted research methodology follows an interpretative approach, taking the case study as its design; it is simultaneously research on the researcher's own practice, the four case studies corresponding to four 1st grade students of the researcher's class. Data collection was carried out during the academic year 2007/2008 and was based on observation, analysis of documents, diaries, audio and video records, and informal conversations with students. Data analysis, which initially accompanied data collection, was based on the research problem and questions, as well as on the theoretical framework supporting the study. The main categories of analysis were defined from that framework during the early stage of analysis and were subsequently adapted and refined while processing the collected data, with a view to constructing the case studies. The results indicate the iconic and the symbolic representations as the students' preferred ones, although these types of representations are used in different ways, with different functions and in different contexts. Symbolic elements are often supported by iconic elements, the latter helping students to unpack the problem and interpret it. Within the iconic representations, the role of the diagram is emphasized as a valuable tool to support mathematical reasoning. One can also conclude that, while active representations give more support to resolution strategies involving simulation, iconic and symbolic representations are used with diverse strategies. The representations constructed, with distinct roles and functions, are crucial to the proper interpretation and resolution of the problems, and seem to be directly related to the characteristics of the proposed task with regard to the mathematical structures involved.

Relevance: 100.00%

Abstract:

The only method used to date to measure dissolved nitrate concentration (NITRATE) with sensors mounted on profiling floats is based on the absorption of light at ultraviolet wavelengths by nitrate ion (Johnson and Coletti, 2002; Johnson et al., 2010; 2013; D’Ortenzio et al., 2012). Nitrate has a modest UV absorption band with a peak near 210 nm, which overlaps with the stronger absorption band of bromide, which has a peak near 200 nm. In addition, there is a much weaker absorption due to dissolved organic matter and light scattering by particles (Ogura and Hanya, 1966). The UV spectrum thus consists of three components, bromide, nitrate and a background due to organics and particles. The background also includes thermal effects on the instrument and slow drift. All of these latter effects (organics, particles, thermal effects and drift) tend to be smooth spectra that combine to form an absorption spectrum that is linear in wavelength over relatively short wavelength spans. If the light absorption spectrum is measured in the wavelength range around 217 to 240 nm (the exact range is a bit of a decision by the operator), then the nitrate concentration can be determined. Two different instruments based on the same optical principles are in use for this purpose. The In Situ Ultraviolet Spectrophotometer (ISUS) built at MBARI or at Satlantic has been mounted inside the pressure hull of a Teledyne/Webb Research APEX and NKE Provor profiling floats and the optics penetrate through the upper end cap into the water. The Satlantic Submersible Ultraviolet Nitrate Analyzer (SUNA) is placed on the outside of APEX, Provor, and Navis profiling floats in its own pressure housing and is connected to the float through an underwater cable that provides power and communications. Power, communications between the float controller and the sensor, and data processing requirements are essentially the same for both ISUS and SUNA. 
There are several possible algorithms that can be used for the deconvolution of nitrate concentration from the observed UV absorption spectrum (Johnson and Coletti, 2002; Arai et al., 2008; Sakamoto et al., 2009; Zielinski et al., 2011). In addition, the default algorithm available in Satlantic sensors is a proprietary approach, but this is not generally used on profiling floats. There are tradeoffs in every approach. To date, almost all nitrate sensors on profiling floats have used the Temperature Compensated Salinity Subtracted (TCSS) algorithm developed by Sakamoto et al. (2009), and this document focuses on that method. It is likely that there will be further algorithm development, so it is necessary that the data systems clearly identify the algorithm that is used. It is also desirable that the data system allow for recalculation of prior data sets using new algorithms. To accomplish this, the float must report not just the computed nitrate but also the observed light intensities. The rule used to obtain a single NITRATE parameter is then: if the spectrum is present, NITRATE should be recalculated from the spectrum; the computation of nitrate concentration can also generate useful diagnostics of data quality.
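The final linear step of a TCSS-style retrieval can be sketched as follows. This assumes the temperature-compensated bromide (seawater) component has already been subtracted, leaving nitrate absorption plus a baseline that is linear in wavelength; the extinction coefficients below are synthetic placeholders, not the calibrated values shipped with an ISUS or SUNA sensor:

```python
import numpy as np

# Sketch of the linear least-squares step of a TCSS-style nitrate fit:
# A(wl) = eno3(wl) * NO3 + a + b * wl, solved over the ~217-240 nm window.
# The nitrate extinction spectrum below is a made-up decaying curve used
# only to demonstrate the fit; real processing uses sensor calibration files.

def fit_nitrate(wavelengths, absorbance, eno3):
    """Least-squares fit of A(wl) = eno3(wl)*NO3 + a + b*wl.

    Returns [NO3, baseline offset, baseline slope]."""
    G = np.column_stack([eno3, np.ones_like(wavelengths), wavelengths])
    coeffs, *_ = np.linalg.lstsq(G, absorbance, rcond=None)
    return coeffs

# Synthetic demonstration over the fit window
wl = np.arange(217.0, 240.5, 0.8)
eno3 = np.exp(-(wl - 210.0) / 12.0)   # placeholder nitrate band shape
true_no3 = 15.0                        # umol/kg, arbitrary
absorb = eno3 * true_no3 + 0.02 - 1e-4 * wl
no3, a, b = fit_nitrate(wl, absorb, eno3)
```

Reporting the full intensity spectrum, as the text recommends, is what makes a refit like this possible after an algorithm update.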

Relevance: 100.00%

Abstract:

The CATARINA Leg1 cruise was carried out from June 22 to July 24, 2012 on board the B/O Sarmiento de Gamboa, under the scientific supervision of Aida Rios (CSIC-IIM). It included a repeat of the OVIDE hydrological section, previously performed in June 2002, 2004, 2006, 2008 and 2010 as part of the CLIVAR program (under the name A25) and under the supervision of Herlé Mercier (CNRS-LPO). This section begins near Lisbon (Portugal), runs through the West European Basin and the Iceland Basin, crosses the Reykjanes Ridge (300 miles north of the Charlie-Gibbs Fracture Zone), and ends at Cape Hoppe (southeastern tip of Greenland). The objective of this repeated hydrological section is to monitor the variability of water mass properties and main current transports in the basin, complementing the international observation array relevant for climate studies. In addition, the Labrador Sea was partly sampled (stations 101-108) between Greenland and Newfoundland, but heavy weather prevented completion of the section south of 53°40'N. The quality of the CTD data is essential to reach the first objective of the CATARINA project, i.e. to quantify the Meridional Overturning Circulation and water mass ventilation changes and their effect on the changes in the anthropogenic carbon ocean uptake and storage capacity. The CATARINA project was mainly funded by the Spanish Ministry of Sciences and Innovation and co-funded by the Fondo Europeo de Desarrollo Regional. The hydrological OVIDE section includes 95 surface-to-bottom stations from coast to coast, collecting profiles of temperature, salinity, oxygen and currents, spaced by 2 to 25 Nm depending on the steepness of the topography. The position of the stations closely follows that of OVIDE 2002. In addition, 8 stations were carried out in the Labrador Sea.
From the 24 bottles closed at various depths at each station, samples of sea water were used for salinity and oxygen calibration, and for measurements of biogeochemical components that are not reported here. The data were acquired with a Seabird CTD (SBE911+) and an SBE43 dissolved oxygen sensor, belonging to the Spanish UTM group. The Seabird SBE Data Processing software was used after decoding and cleaning the raw data. The LPO Matlab toolbox was then used to calibrate and bin the data, as was done for the previous OVIDE cruises, using on the one hand pre- and post-cruise calibration results for the pressure and temperature sensors (done at Ifremer) and on the other hand the water samples from the 24 rosette bottles at each station for the salinity and dissolved oxygen data. A final accuracy of 0.002°C, 0.002 psu and 0.04 ml/l (2.3 µmol/kg) was obtained on the final profiles of temperature, salinity and dissolved oxygen, compatible with international requirements issued from the WOCE program.
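The bottle-sample calibration step described above can be sketched as a simple linear fit of CTD salinity against the more accurate bottle salinities. The numbers below are synthetic, and the real processing (the LPO Matlab toolbox) also handles pressure/temperature sensor calibrations, binning and station-by-station quality control:

```python
# Sketch of a bottle-based CTD calibration: fit bottle = slope*ctd + offset
# by least squares, then apply the correction to the full profile.
# Synthetic values only; not the actual OVIDE/CATARINA coefficients.

def linear_calibration(ctd_values, bottle_values):
    """Least-squares fit of bottle = slope * ctd + offset."""
    n = len(ctd_values)
    mx = sum(ctd_values) / n
    my = sum(bottle_values) / n
    sxx = sum((x - mx) ** 2 for x in ctd_values)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ctd_values, bottle_values))
    slope = sxy / sxx
    offset = my - slope * mx
    return slope, offset

# Synthetic example: CTD reads slightly low with a tiny gain error
ctd = [34.90, 35.00, 35.10, 35.20]
bottle = [s * 1.0002 + 0.004 for s in ctd]
slope, offset = linear_calibration(ctd, bottle)
calibrated = [slope * s + offset for s in ctd]
```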

Relevance: 100.00%

Abstract:

This research addresses social housing and its relation to thermal comfort, applied to an architectural and urban intervention on a site in the central urban area of Macaíba/RN, Brazil. One of its main goals is to reflect on the role of design and of alternative building materials in the search for better performance. The hypothesis is that, by changing design parameters and the choice of materials, it is possible to achieve better thermal performance. Computer simulations of thermal performance and of natural ventilation were therefore carried out, the latter using computational fluid dynamics (CFD). The presentation of the thermal simulation followed the methodology proposed in the dissertation of Negreiros (2010), which aims to find the percentage of comfort hours obtained throughout the year, while natural ventilation was analysed from images extracted from the CFD simulations. From the designed building model, an analytical framework was established that compares three different housing proposals, evaluating the thermal performance of the buildings together with the spatial design variables, construction materials and costs. It is concluded that the study confirmed the general hypotheses set at its start: it was possible to quantify the results and to show that the contributions of design and of construction materials are comparable and that, if combined, they lead to potential gains in thermal performance.

Relevance: 100.00%

Abstract:

In aerial photogrammetry, the restitution process (conversion of imagery into vectorized electronic format) is performed by a human operator with the assistance of specialized hardware and software. This process involves the interpretation of geographic features, topographic details, etc., which introduces both geometric (precision) and topological (connectivity) errors into the vectorized digital data. Additionally, even if the vectorization is perfect, editors in subsequent stages must perform repetitive tasks: formatting, labelling, adjustment of conventions, etc., which, given the size of the data files, become lengthy and error-prone. Both the correction processes and the formatting and labelling processes also require user input/output on the computer, which is particularly slow. This research presents the development of automatic tools for (i) detection and correction of common errors in restituted maps, (ii) intelligent partitioning and regrouping of large maps, and (iii) automatic formatting and labelling. The software is developed using the AIS (Application Interface Specification) standard, which makes it portable to modellers whose AIS interface has been implemented. The project was developed for the Colombian firm AeroEstudios LTDA, which has incorporated it into its digital information processing tools.
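One of the connectivity checks described above can be illustrated with a small sketch: detecting "dangling" endpoints in vectorized line work, i.e. endpoints that almost touch another endpoint but miss it by a small gap, a typical topology error after manual restitution. The tolerance value is an assumption for illustration, not taken from the project:

```python
# Illustrative dangling-endpoint detector for vectorized line work.
# Endpoints closer than a snapping tolerance but not exactly coincident
# are reported as candidate connectivity errors. O(n^2) for clarity;
# a production tool would use a spatial index.

from math import hypot

def dangling_pairs(endpoints, tolerance=0.05):
    """Return index pairs of endpoints closer than `tolerance`
    but not exactly coincident - candidates for snapping."""
    pairs = []
    for i in range(len(endpoints)):
        for j in range(i + 1, len(endpoints)):
            (x1, y1), (x2, y2) = endpoints[i], endpoints[j]
            d = hypot(x2 - x1, y2 - y1)
            if 0.0 < d < tolerance:
                pairs.append((i, j))
    return pairs

# Example: endpoint 1 undershoots endpoint 0 by 0.01 map units
pts = [(10.0, 5.0), (10.0, 5.01), (42.0, 7.0)]
print(dangling_pairs(pts))  # [(0, 1)]
```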

Relevance: 100.00%

Abstract:

This thesis builds a framework for evaluating downside risk from multivariate data via a special class of risk measures (RM). The peculiarity of the analysis lies in dispensing with strong distributional assumptions on the data and in its orientation towards the most critical data in risk management: those with asymmetries and heavy tails. At the same time, under typical assumptions such as ellipticity of the data probability distribution, conformity with classical methods is shown. The constructed class of RM is a multivariate generalization of the coherent distortion RM, which possesses valuable properties for a risk manager. The design of the framework is twofold. The first part contains new computational geometry methods for high-dimensional data. The developed algorithms demonstrate the computability of the geometrical concepts used for constructing the RM. These concepts bring visuality and simplify interpretation of the RM. The second part develops models for applying the framework to actual problems. The spectrum of applications varies from robust portfolio selection up to broader spheres, such as stochastic conic optimization with risk constraints or supervised machine learning.
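The class of coherent distortion risk measures that the thesis generalizes can be illustrated in the univariate, empirical case. A concave distortion function g reweights the loss distribution towards the tail: g(u) = min(u/alpha, 1) approximates Expected Shortfall at level alpha, while g(u) = u recovers the plain mean. This is only a minimal sketch; the multivariate, geometry-based construction in the thesis is substantially richer:

```python
# Minimal empirical distortion risk measure: losses sorted worst-first,
# each weighted by the increment of the distortion function g over the
# empirical distribution. A concave g yields a coherent risk measure.

def distortion_rm(losses, g):
    """Distortion risk measure of an empirical loss sample."""
    n = len(losses)
    ordered = sorted(losses, reverse=True)  # worst losses get tail weights
    return sum(loss * (g((i + 1) / n) - g(i / n))
               for i, loss in enumerate(ordered))

losses = [1.0, 2.0, 3.0, 10.0]
mean_rm = distortion_rm(losses, lambda u: u)                 # the mean, 4.0
es_25 = distortion_rm(losses, lambda u: min(u / 0.25, 1.0))  # worst 25%, 10.0
```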

Relevance: 100.00%

Abstract:

The purpose of this internship report is to understand and analyse the status quo of the dimensions of satisfaction (covering personal and social well-being) of the users of the services of the Santa Casa da Misericórdia de Évora, thereby analysing and understanding the quality of life of those users and addressing the potential and/or evident institutional needs to be met and social problems to be eradicated. The information was obtained through two methodological data collection instruments: a self-administered questionnaire survey (users) and semi-structured interviews (technical directors of the services). Regarding data processing methods, IBM SPSS 22 was used for the first instrument and content analysis for the second. After the existing needs were identified, a proposal for socio-organizational intervention was drawn up in order to address them. The document is organized along four axes: Axis 1, theoretical component; Axis 2, methodological component; Axis 3, results; Axis 4, socio-organizational intervention.

Relevance: 100.00%

Abstract:

Analytics is the technology concerned with the manipulation of data to produce information able to change the world we live in every day. Analytics has been largely used within the last decade to cluster people's behaviour and predict their preferences: items to buy, music to listen to, movies to watch, and even electoral preferences. The most advanced companies have succeeded in influencing people's behaviour using analytics. Despite the evidence of this power, analytics is rarely applied to the big data collected within supply chain systems (i.e. distribution networks, storage systems and production plants). This PhD thesis explores the fourth research paradigm (i.e. the generation of knowledge from data) applied to supply chain system design and operations management. An ontology defining the entities and the metrics of supply chain systems is used to design data structures for data collection in supply chain systems. The consistency of these data is provided by mathematical demonstrations inspired by factory physics theory. The availability, quantity and quality of the data within these data structures define different decision patterns. Ten decision patterns are identified, and validated in the field, to address ten different classes of design and control problems in supply chain systems research.

Relevance: 100.00%

Abstract:

The aim of this novel experimental study is to investigate the behaviour of a 2 m x 2 m model of a masonry groin vault, built by assembling blocks made of a 3D-printed plastic skin filled with mortar. The groin vault was chosen because of the large presence of this vulnerable roofing system in historical heritage buildings. Shaking-table tests are carried out to explore the vault response for two support boundary conditions, involving four lateral confinement modes. Processing of the marker displacement data makes it possible to examine the collapse mechanisms of the vault, based on the deformed shapes of its arches. A numerical evaluation then follows, to provide the orders of magnitude of the displacements associated with these mechanisms. Given that these displacements are related to the shortening and elongation of the arches, the last objective is the definition of a critical elongation between two diagonal bricks and, consequently, of a diagonal portion. This study continues previous work and takes another step forward in the research on ground motion effects on masonry structures.
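The marker-based elongation check mentioned above can be sketched as follows: from the tracked 3D positions of two markers on a diagonal, compute the relative elongation of the segment over time and flag the frames exceeding a critical threshold. The 2% critical elongation used here is an assumed value for illustration, not the one determined in the study:

```python
# Sketch of a marker-based elongation check: relative length change of
# the segment between two tracked markers, per frame, against an assumed
# critical elongation threshold.

from math import dist  # Euclidean distance between two points (Python 3.8+)

def elongation_series(marker_a, marker_b):
    """Relative elongation of segment a-b per frame, w.r.t. frame 0."""
    l0 = dist(marker_a[0], marker_b[0])
    return [(dist(a, b) - l0) / l0 for a, b in zip(marker_a, marker_b)]

def frames_over_limit(elongations, critical=0.02):
    """Indices of frames whose elongation exceeds the critical value."""
    return [i for i, e in enumerate(elongations) if e > critical]

# Example: two markers 100 mm apart that drift to 103 mm
a = [(0.0, 0.0, 0.0)] * 3
b = [(100.0, 0.0, 0.0), (101.0, 0.0, 0.0), (103.0, 0.0, 0.0)]
elo = elongation_series(a, b)
print(frames_over_limit(elo))  # [2]
```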

Relevance: 100.00%

Abstract:

With the CERN LHC program underway, there has been an acceleration of data growth in the High Energy Physics (HEP) field, and the usage of Machine Learning (ML) in HEP will be critical during the HL-LHC program, when the data produced will reach the exascale. ML techniques have been successfully used in many areas of HEP; nevertheless, the development of an ML project and its implementation for production use is a highly time-consuming task and requires specific skills. Complicating this scenario is the fact that HEP data are stored in the ROOT data format, which is mostly unknown outside the HEP community. The work presented in this thesis is focused on the development of a ML as a Service (MLaaS) solution for HEP, aiming to provide a cloud service that allows HEP users to run ML pipelines via HTTP calls. These pipelines are executed by the MLaaS4HEP framework, which allows reading data, processing data, and training ML models directly from ROOT files of arbitrary size, from local or distributed data sources. Such a solution provides HEP users who are not ML experts with a tool that allows them to apply ML techniques in their analyses in a streamlined manner. Over the years, the MLaaS4HEP framework has been developed, validated, and tested, and new features have been added. A first MLaaS solution was developed by automating the deployment of a platform equipped with the MLaaS4HEP framework. Then, a service with APIs was developed, so that a user, after being authenticated and authorized, can submit MLaaS4HEP workflows producing trained ML models ready for the inference phase. A working prototype of this service is currently running on a virtual machine of INFN-Cloud and is eligible to be added to the INFN Cloud portfolio of services.