995 results for FitMaster Workout Manager Software Windows SQL Server Centralizzato


Relevance:

30.00%

Publisher:

Abstract:

Educational software that combines the features of a tutorial with the ability to build a variety of exercises for students, including automatic assessment: dictations, questionnaires and exercises. All materials are handled in multimedia and hypertext formats. Among those already developed are exercise packages for Catalan, Spanish, French and English, a module on the European Union and one on ecology, among several others produced by different authors using the application. It also offers the option of developing new materials on this platform.

Relevance:

30.00%

Publisher:

Abstract:

Abstract based on the one in the publication.

Relevance:

30.00%

Publisher:

Abstract:

To develop and evaluate new materials for interdisciplinary teaching covering traditional content in human and economic geography as well as statistics. To incorporate the latest research findings on the learning effects of new media based on information and communication technologies. To put forward an example of curriculum development for a set of contents that is not confined to a single topic. To develop teaching-learning activities focused, first, on the acquisition of climate-related concepts, principles and procedures and, second, on decision-making through a probabilistic simulation model, leading to the acquisition of skills and intellectual abilities suited to the cognitive development of 14-16 year-old students, taking advantage of what the computer offers. The work has two stages: the first designs and builds the computer program, and the second is devoted to its evaluation. Design and production of the program go through the following phases: 1) selection, design and structuring of the teaching-learning activities; 2) preparation of the preliminary version of the software and of the complementary written materials; 3) revision and adjustment of the preliminary version; 4) preparation of the final version of the program. For the evaluation, an educational software evaluation model is designed and secondary-school teachers, both trainees and in service, from the areas of geography and history and of mathematics, are asked to evaluate the program for their subject. Instrument: the computer program on climate. Results: 1. Teachers rated the program as very useful. 2. It adequately covers the objectives of motivating students and eliciting and restructuring their prior ideas, and it proposes a sound sequence of activities for learning concepts, principles and procedures, both climatic and probabilistic. 3. Within the categories of educational software it is classified simultaneously as a tutorial, simulation, game and problem-solving program, i.e. it can be adapted to different types of teachers and teaching situations. 4. Several technical features of the program are rated very positively: its interactivity, the clarity of the explanations and help, the animation, the graphics, the colour and the use of hypertext. 5. Several technical shortcomings or difficulties are noted: the high demands placed on the computer in terms of memory, processing speed and graphics resolution, and the need for the user to handle the Windows environment correctly. 6. Didactic shortcomings: the program does not keep a record of students' names and actions.

Relevance:

30.00%

Publisher:

Abstract:

The growing momentum of Spatial Data Infrastructures (SDIs) creates demand for building geoportals and, in turn, for tools that not only ease their construction, configuration and deployment but also offer the option of contracting professional technical support. OpenGeo Suite is a professional, integrated free-software package that covers everything from the storage of geographic data to its publication using OGC standards and the implementation of web GIS solutions with open-source JavaScript libraries. OpenGeo Suite can be deployed on multiple platforms (Linux, Windows and OS X) and consists of four tightly integrated free-software components based on OGC standards. The server-side components are aimed at the storage, configuration and publication of data by GIS technicians: PostgreSQL plus the PostGIS spatial extension, which handles the storage of geographic information and supports spatial analysis functions; pgAdmin as the database management tool, which eases data import and updating; and GeoServer, which publishes geographic information coming from different data sources (PostGIS, SHP, Oracle Spatial, GeoTIFF, etc.), supporting most OGC publication standards (WMS, WFS, WCS) and formats (GML, KML, GeoJSON, SLD), and which also offers tile caching through GeoWebCache. OpenGeo Suite provides two applications, GeoExplorer and GeoEditor, which let the technician build a geoportal with geometry-editing capabilities, and an administration console (Dashboard) that eases the configuration of the components. On the client side, the components are JavaScript development libraries aimed at web GIS application developers: OpenLayers, with support for raster and vector layers, styles, projections, tiling, editing tools, etc., and GeoExt, for building the front end of geoportals, based on ExtJS and tightly coupled to OpenLayers.
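
As a rough illustration of how the server-side storage and publication layers fit together, the sketch below runs a spatial query against PostGIS and then requests a rendered map from GeoServer via the OGC WMS standard. It is a minimal sketch only: the connection parameters, table name (parcels) and layer name (workspace:parcels) are placeholder assumptions, not details taken from the text.

```python
# Minimal sketch: PostGIS spatial query plus a GeoServer WMS GetMap request.
# Connection details, table and layer names are illustrative placeholders.
import psycopg2
import requests

# 1) Storage/analysis layer: spatial query against PostGIS.
conn = psycopg2.connect(host="localhost", dbname="gis", user="gis", password="secret")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT id, ST_AsGeoJSON(geom)
        FROM parcels
        WHERE ST_DWithin(geom::geography, ST_MakePoint(%s, %s)::geography, %s)
    """, (-3.70, 40.42, 500))  # parcels within 500 m of a point
    for parcel_id, geojson in cur.fetchall():
        print(parcel_id)

# 2) Publication layer: ask GeoServer for a rendered map through the WMS standard.
params = {
    "service": "WMS", "version": "1.1.1", "request": "GetMap",
    "layers": "workspace:parcels", "srs": "EPSG:4326",
    "bbox": "-3.8,40.3,-3.6,40.5", "width": 512, "height": 512,
    "format": "image/png",
}
resp = requests.get("http://localhost:8080/geoserver/wms", params=params)
with open("map.png", "wb") as f:
    f.write(resp.content)
```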

Relevance:

30.00%

Publisher:

Abstract:

Much consideration is rightly given to the design of metadata models to describe data. At the other end of the data-delivery spectrum much thought has also been given to the design of geospatial delivery interfaces such as the Open Geospatial Consortium standards: Web Coverage Service (WCS), Web Map Service (WMS) and Web Feature Service (WFS). Our recent experience with the Climate Science Modelling Language shows that an implementation gap exists where many challenges remain unsolved. Bridging this gap requires transposing information and data from one world view of geospatial climate data to another. Some of the issues include: the loss of information in mapping to a common information model, the need to create ‘views’ onto file-based storage, and the need to map onto an appropriate delivery interface (as with the choice between WFS and WCS for feature types with coverage-valued properties). Here we summarise the approaches we have taken in facing up to these problems.
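
As a rough sketch of the two delivery routes mentioned above, the snippet below uses the OWSLib client library to fetch a discrete feature via WFS and to subset a gridded field via WCS. The endpoint URL, feature type name, coverage identifier and output format are assumptions for illustration, not services described in the text.

```python
# Minimal sketch of the WFS-versus-WCS choice, using OWSLib.
# The endpoint URL, feature type and coverage identifier are placeholders.
from owslib.wfs import WebFeatureService
from owslib.wcs import WebCoverageService

endpoint = "http://example.org/ows"  # hypothetical server

# Route 1: WFS, natural for discrete features with simple-valued properties.
wfs = WebFeatureService(url=endpoint, version="1.1.0")
response = wfs.getfeature(typename=["csml:PointSeriesFeature"])
gml = response.read()  # GML document describing the features

# Route 2: WCS, natural when a property is coverage-valued (e.g. a gridded
# temperature field) and needs subsetting by bounding box.
wcs = WebCoverageService(url=endpoint, version="1.0.0")
field = wcs.getCoverage(
    identifier="surface_temperature",
    bbox=(-10.0, 45.0, 5.0, 60.0),
    crs="EPSG:4326",
    format="NetCDF",
    width=400, height=300,
)
with open("temperature.nc", "wb") as f:
    f.write(field.read())
```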

Relevance:

30.00%

Publisher:

Abstract:

A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. To avoid high-level, top-down modeling and to increase result accuracy, the focus was placed on device details and data routes. In order to compare ESD with a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides both ESD and physical distribution options. The ESD method included the calculation of the power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included, to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and the networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was apportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model, which revealed potential CO2e savings of 83% when ESD was used instead of physical distribution. The results also highlighted the importance of server efficiency and utilization methods.
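
To make the structure of such a per-download estimate concrete, here is a minimal sketch of the accounting described above: a server share scaled by utilization, internet transfer over the data route, and client browsing/downloading, converted to CO2e with a grid factor. Every numeric value is an illustrative placeholder, not a figure from the study, and the allocation choices are assumptions.

```python
# Illustrative structure of a per-download ESD energy/GHG estimate.
# Every number below is a placeholder assumption, not a figure from the study.

download_size_gb = 4.0        # size of the distributed software
server_power_w = 300.0        # server power draw
server_utilisation = 0.5      # share of server capacity actually used
server_time_s = 120.0         # server time attributable to this download
network_kwh_per_gb = 0.05     # per-GB transfer energy across the data hops
client_power_w = 60.0         # client PC while browsing/downloading
client_time_s = 1800.0
grid_kg_co2e_per_kwh = 0.4    # grid emission factor

# One possible allocation choice: scale server energy up by idle capacity.
server_kwh = (server_power_w / 1000.0) * (server_time_s / 3600.0) / server_utilisation
network_kwh = network_kwh_per_gb * download_size_gb
client_kwh = (client_power_w / 1000.0) * (client_time_s / 3600.0)

total_kwh = server_kwh + network_kwh + client_kwh
total_co2e_kg = total_kwh * grid_kg_co2e_per_kwh
print(f"{total_kwh:.3f} kWh -> {total_co2e_kg:.3f} kg CO2e per download")
```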

Relevance:

30.00%

Publisher:

Abstract:

The FunFOLD2 server is a new independent server that integrates our novel protein–ligand binding site and quality assessment protocols for the prediction of protein function (FN) from sequence via structure. Our guiding principles were, first, to provide a simple unified resource that makes our function prediction software easily accessible to all via a simple web interface and, second, to produce integrated output for predictions that can be easily interpreted. The server provides a clean web interface so that results can be viewed on a single page and interpreted by non-experts at a glance. The output for a prediction is an image of the top predicted tertiary structure annotated to indicate putative ligand-binding site residues. The results page also includes a list of the most likely binding site residues and the types of predicted ligands and their frequencies in similar structures. The protein–ligand interactions can also be interactively visualized in 3D using the Jmol plug-in. The raw machine-readable data, which comply with the Critical Assessment of Techniques for Protein Structure Prediction data standards for FN predictions, are provided for developers. The FunFOLD2 web server is freely available to all at the following web site: http://www.reading.ac.uk/bioinf/FunFOLD/FunFOLD_form_2_0.html.

Relevance:

30.00%

Publisher:

Abstract:

Modern database applications increasingly employ database management systems (DBMSs) to store multimedia and other complex data. To adequately support the queries required to retrieve these kinds of data, the DBMS needs to answer similarity queries. However, the standard structured query language (SQL) does not provide effective support for such queries. This paper proposes an extension to SQL that seamlessly integrates syntactic constructions for expressing similarity predicates into the existing SQL syntax, and describes the implementation of a similarity retrieval engine that allows similarity queries to be posed using the language extension in a relational DBMS. The engine allows the evaluation of every aspect of the proposed extension, including the data definition language and data manipulation language statements, and employs metric access methods to accelerate the queries. Copyright (c) 2008 John Wiley & Sons, Ltd.
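
Since the proposed extended syntax is not reproduced in the abstract, the sketch below shows the kind of similarity (k-nearest-neighbour) query the extension is meant to make first-class, expressed instead in standard SQL with a user-defined distance function; the table, column and function names are made up for illustration.

```python
# Sketch: a k-nearest-neighbour "similarity query" expressed in standard SQL.
# This is NOT the paper's extended syntax; it illustrates the kind of query the
# extension is meant to make first-class. Table and column names are invented.
import json
import math
import sqlite3

def euclidean(a_json, b_json):
    """Distance between two feature vectors stored as JSON arrays."""
    a, b = json.loads(a_json), json.loads(b_json)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

conn = sqlite3.connect(":memory:")
conn.create_function("DIST", 2, euclidean)   # register the metric as a UDF
conn.execute("CREATE TABLE images (id INTEGER PRIMARY KEY, features TEXT)")
conn.executemany(
    "INSERT INTO images (id, features) VALUES (?, ?)",
    [(1, "[0.1, 0.2]"), (2, "[0.9, 0.8]"), (3, "[0.2, 0.1]")],
)

query_vec = "[0.15, 0.15]"
# Without native similarity predicates the query must be phrased as ORDER BY + LIMIT.
rows = conn.execute(
    "SELECT id, DIST(features, ?) AS d FROM images ORDER BY d LIMIT 2",
    (query_vec,),
).fetchall()
print(rows)   # the two images most similar to the query vector
```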

Relevance:

30.00%

Publisher:

Abstract:

The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as the programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, the requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using Extensible Markup Language (XML) technology. The communication between clients and servers uses remote procedure calls based on XML (XML-RPC). The integration of the Java language, XML and XML-RPC technologies makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the web application provides a simple graphical user interface (GUI). The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
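
The abstract describes the access layer only in general terms, so the following is a minimal, hypothetical sketch of the idea: a data-access method exposed over XML-RPC so that every laboratory can call it with the same signature. It is written in Python for brevity, whereas the actual system is implemented in Java, and the method and signal names are invented.

```python
# Minimal sketch of a shared data-access layer over XML-RPC (the project itself
# is implemented in Java; the method and signal names here are hypothetical).
from xmlrpc.server import SimpleXMLRPCServer

def get_signal(shot_number, signal_name):
    """Return a (time, values) series for one diagnostic signal of a shot."""
    # Placeholder data; a real server would read the acquisition database.
    return {"shot": shot_number, "signal": signal_name,
            "time": [0.0, 0.001, 0.002], "values": [0.0, 1.2, 0.9]}

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
server.register_function(get_signal, "get_signal")
server.serve_forever()
```

A client in any laboratory could then call, for example, xmlrpc.client.ServerProxy("http://host:8000").get_signal(12345, "Ip") and receive the same structure regardless of which laboratory hosts the data.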

Relevance:

30.00%

Publisher:

Abstract:

This work presents an architecture for Software Development Environments (SDEs). The architecture is based on commercial off-the-shelf (COTS) products, mainly a Workflow Management System (WfMS) - Microsoft Exchange 2000 Server (E2K) - and runs over the Internet, also integrating some of the tools from the large set of applications used in the software development process. The development of a prototype (WOSDIE - WOrkflow-based Software Development Integrated Environment) based on the proposed architecture is described in detail, showing the construction steps, the functions implemented and the mechanisms needed to integrate a WfMS, development tools, a database (WSS - Web Storage System) and other elements in order to build an SDE. The software process applied in WOSDIE was extracted from RUP (Rational Unified Process). This process was modelled in the Workflow Designer tool, which allows workflow processes to be modelled inside E2K. Launching tools from a web browser and storing the artifacts produced in a software project are also covered. E2K monitors the events that occur inside the WOSDIE environment and, based on the conditions modelled in Workflow Designer, determines which activities should start after a given activity finishes and who is responsible for executing these new activities (activity assignment). The proposed architecture and the WOSDIE prototype are evaluated against criteria drawn from several works. These evaluations show the characteristics of the proposed architecture in more detail and describe the advantages and problems associated with WOSDIE.
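
The event-driven activity assignment described above can be illustrated with a small, generic sketch; it is not the E2K/Workflow Designer implementation, and the activity names, outcomes and roles are hypothetical.

```python
# Generic sketch of event-driven activity assignment in a workflow engine.
# This is not the E2K/Workflow Designer implementation; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class Transition:
    after: str         # activity that has just finished
    condition: str     # outcome required for the transition to fire
    next_activity: str
    assignee: str      # role responsible for the new activity

TRANSITIONS = [
    Transition("elaborate_use_cases", "approved", "design_classes", "designer"),
    Transition("elaborate_use_cases", "rejected", "revise_requirements", "analyst"),
    Transition("design_classes", "approved", "implement_classes", "developer"),
]

def on_activity_finished(activity, outcome):
    """Called when an activity ends; returns the activities to assign next."""
    return [(t.next_activity, t.assignee)
            for t in TRANSITIONS
            if t.after == activity and t.condition == outcome]

print(on_activity_finished("elaborate_use_cases", "approved"))
# -> [('design_classes', 'designer')]
```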

Relevance:

30.00%

Publisher:

Abstract:

Several authors have pointed out the need to understand the technological structuring process in contemporary firms. From this perspective the software industry is a very important element, because it provides products and services directly to organizations from many fields. The Brazilian software industry, in particular, has peculiarities that distinguish it from the industries of developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. This study therefore aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and processes that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements through an exploratory, descriptive and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; at each one a form was applied and an interview was conducted with one of the firm's key professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that proved very useful for the analysis based on the interpretation of the interviews. As a result, it was found that companies are structured around hybrid business models drawn from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced split between the traditional and agile paradigms: among the traditional methodologies the Rational Unified Process (RUP) predominates, while Scrum is the most widely used methodology among organizations that follow the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in a way that generates different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development-tool suppliers and the managers' schooling and background, because they relate closely to the software firms; this influence was found to be dual and bilateral. Finally, the structuring level of the organizational field was identified as low, which leaves organizational actors room to act independently.

Relevance:

30.00%

Publisher:

Abstract:

This work discusses environmental management on the basis of the ISO 14001 standard and the learning organization concept. The study is carried out through an exploratory survey in a fuel transport company located in Natal/RN. The objective of the research was to investigate the environmental management practices carried out in the context of the ISO 14001 environmental management system implemented in the researched organization, from the perspective of the learning organization. The methodology is quantitative, combining exploratory and descriptive approaches, and uses questionnaires; the scope of the research covers the company's managers, coordinators, supervisors and employees, both its own staff and contractors. The data were analyzed using Excel and Statistica version 6.0, and the analysis is divided into two parts: descriptive analysis and cluster analysis. Based on the theory studied as well as on the survey results, the ISO 14001 system implemented in the researched organization presents elements that promote a learning organization. From the results it can be concluded that the company uses external information when making decisions on environmental problems, that employees are mobilized to generate ideas and to collect environmental information, and that the company has established partnerships with other companies for activities in the environmental area; all of these items can contribute to the organization's knowledge generation. It can also be concluded that the company has evaluated past environmental errors and has carried out environmental benchmarking, practices that can be considered good ways for the company to acquire knowledge. The results also show that employees have not found it difficult to carry out their tasks when the manager of their sector is not present, which may indicate that the company has good knowledge diffusion.

Relevance:

30.00%

Publisher:

Abstract:

Due to industry's current need to integrate production-floor data coming from several sources and to turn it into information useful for decision making, there is ever-growing demand for information visualization systems that provide this functionality. At the same time, given the market's high competitiveness, a common practice nowadays is to develop industrial systems with characteristics of modularity, distribution, flexibility, scalability, adaptability, interoperability, reusability and web access. These characteristics provide extra agility and make it easier to adapt to frequent changes in market demand. Based on the arguments above, this work specifies a component-based architecture, together with the development of a system based on that architecture, for the visualization of industrial data. The system was conceived to supply on-line information and, optionally, historical information on variables coming from the production floor. This work shows that the component-based architecture developed meets the requirements for obtaining a robust, reliable and easily maintained system, in line with industrial needs. The architecture also allows components to be added, removed or updated at run time through a web-based component manager, further strengthening the system's adaptation and updating process.
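
As a generic illustration of run-time component management (not the system described above; the names and interfaces are hypothetical), the sketch below shows a registry in which visualization components can be added, removed or updated while the application is running.

```python
# Generic sketch of a registry that lets visualization components be added,
# removed or updated at run time (names and interfaces are hypothetical).
from typing import Callable, Dict

class ComponentRegistry:
    """Holds named components; a web manager could invoke these methods remotely."""

    def __init__(self) -> None:
        self._components: Dict[str, Callable[[dict], str]] = {}

    def add(self, name: str, render: Callable[[dict], str]) -> None:
        self._components[name] = render          # add or update in place

    def remove(self, name: str) -> None:
        self._components.pop(name, None)

    def render_all(self, sample: dict) -> None:
        for name, render in self._components.items():
            print(f"[{name}] {render(sample)}")

registry = ComponentRegistry()
registry.add("gauge", lambda s: f"temperature = {s['temperature']} C")
registry.add("alarm", lambda s: "OK" if s["temperature"] < 80 else "OVERHEAT")
registry.render_all({"temperature": 72})
registry.remove("alarm")                          # component dropped at run time
registry.render_all({"temperature": 91})
```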

Relevance:

30.00%

Publisher:

Abstract:

In this work a software package developed at the Instituto de Fisica Gleb Wataghin, IFGW, UNICAMP, Campinas, SP, Brazil, for obtaining thermal histories using apatite fission track analysis is presented. The software runs in the Microsoft Windows environment and will be made freely available on the web site of the Departamento de Raios Cosmicos, IFGW, UNICAMP. Thermal histories obtained with this software are compared with those deduced using Monte Trax, the Apple Macintosh software developed by Gallagher. (C) 2001 Elsevier B.V. Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To compare simulated periodontal bone defect depth measured in digital radiographs with dedicated and non-dedicated software systems, and to compare the depth measurements from each program with the measurements in dry mandibles. Methods: Forty periodontal bone defects were created at the proximal area of the first premolar in dry pig mandibles. Measurements of the defects were performed with a periodontal probe in the dry mandibles. Periapical digital radiographs of the defects were recorded using the Schick sensor in a standardized exposure setting. All images were read using the Schick dedicated software system (CDR DICOM for Windows v.3.5) and three commonly available non-dedicated software systems (Vix Win 2000 v.1.2, Adobe Photoshop 7.0 and Image Tool 3.0). The defects were measured three times in each image and a consensus was reached among three examiners using the four software systems. The differences between the radiographic measurements were analysed using analysis of variance (ANOVA), and the measurements from each software system were compared with the dry-mandible measurements using Student's t-test. Results: The mean values of the bone defects measured in the radiographs were 5.07 mm, 5.06 mm, 5.01 mm and 5.11 mm for CDR DICOM for Windows, Vix Win, Adobe Photoshop and Image Tool, respectively, and 6.67 mm for the dry mandible. The means of the measurements performed in the four software systems were not significantly different (ANOVA, P = 0.958). A significant underestimation of defect depth was obtained when the mean depths from each software system were compared with the dry-mandible measurements (t-test, P ≈ 0.000). Conclusions: The periodontal bone defect measurements in the dedicated and the three non-dedicated software systems were not significantly different, but all of them underestimated the depth when compared with the measurements obtained in the dry mandibles.
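
The two tests described above can be illustrated with the sketch below: a one-way ANOVA across the four software systems and, as a simplified one-sample approximation, a t-test of each system against the dry-mandible reference depth. The measurement arrays are synthetic placeholders, not the study's data; only the 6.67 mm reference mean is taken from the abstract.

```python
# Sketch of the statistical comparison described above, on synthetic numbers.
# The arrays below are illustrative placeholders, not the study's measurements.
from scipy import stats

# Defect depths (mm) measured with four software systems on the same defects.
cdr_dicom = [5.1, 5.0, 5.2, 4.9, 5.1]
vixwin    = [5.0, 5.1, 5.1, 5.0, 5.0]
photoshop = [4.9, 5.0, 5.1, 5.0, 5.0]
imagetool = [5.2, 5.1, 5.0, 5.1, 5.1]

# 1) One-way ANOVA: do the four software systems differ among themselves?
f_stat, p_anova = stats.f_oneway(cdr_dicom, vixwin, photoshop, imagetool)
print(f"ANOVA: F = {f_stat:.3f}, P = {p_anova:.3f}")

# 2) One-sample approximation of the t-test: does each system underestimate
#    the dry-mandible reference depth reported in the abstract?
dry_mandible_mean = 6.67  # mm, probe measurement in the dry mandibles
for name, values in [("CDR DICOM", cdr_dicom), ("Vix Win", vixwin),
                     ("Photoshop", photoshop), ("Image Tool", imagetool)]:
    t_stat, p_t = stats.ttest_1samp(values, dry_mandible_mean)
    print(f"{name}: t = {t_stat:.2f}, P = {p_t:.4f}")
```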