880 results for Building information modeling


Relevance: 30.00%

Abstract:

This tutorial is intended to be a "quick start" to creating simulations with GENESIS. It should give you the tools and enough information to let you quickly begin creating cells and networks with GENESIS, making use of the provided example simulations. Advanced topics are covered by appropriate links to the Advanced Tutorials on Realistic Neural Modeling.

Relevance: 30.00%

Abstract:

National and international studies demonstrate that the number of teenagers using the internet is increasing. But even though they do have access, from different places, to the information and communication pool of the internet, there is evidence that the ways in which teenagers use the net - regarding the scope and frequency with which services are used, as well as the preferences for different contents of these services - differ significantly in relation to socio-economic status, education, and gender. The results of the relevant empirical studies may be summarised as follows: teenagers with low (formal) education mainly use internet services centred on 'entertainment, play and fun', while better-educated teenagers (also) prefer intellectually more demanding services, particularly those offering a greater variety of communicative and informative activities. More generally, pedagogical and sociological studies investigating the "digital divide" in a differentiated and sophisticated way - i.e. not only in terms of differences between those who have access to the internet and those who do not - suggest that the internet is no space beyond 'social reality' (e.g. DiMaggio & Hargittai 2001, 2003; Vogelgesang, 2002; Welling, 2003). The different modes of utilisation that structure the internet as a social space are primarily a specific contextualisation of that reality - and thus the opportunities and constraints in the virtual world of the internet are no less related than those in the 'real world' to unequal distributions of material, social and cultural resources, and to the social embeddings of the actors involved. This inequality also holds for the outcomes of internet use. Empirical and theoretical results concerning forms and processes of networking and community building - i.e. sociability on the internet, as well as the social embeddings of users that are mediated through the internet - suggest that net-based communication and information processes may yield the resource of 'social support'. Thus, with reference to social work and its task of counteracting the reproduction of social disadvantage - whether medial or not - the ways in which teenagers gain access to and utilise net-based social support need to be analysed.

Relevance: 30.00%

Abstract:

Under the brand name “sciebo – the Campuscloud” (derived from “science box”), a consortium of more than 20 research and applied-science universities started a large-scale cloud service for about 500,000 students and researchers in North Rhine-Westphalia, Germany’s most populous state. Starting with the much-anticipated, data-privacy-compliant sync & share functionality, sciebo has the potential to become a more general cloud platform for collaboration and research data management, which will be actively pursued in upcoming scientific and infrastructural projects. This project report describes the formation of the venture, its targets, the technical and legal solutions, as well as the current status and the next steps.

Relevance: 30.00%

Abstract:

By means of fixed-links modeling, the present study identified different processes of visual short-term memory (VSTM) functioning and investigated how these processes are related to intelligence. We conducted an experiment in which participants were presented with a color change detection task. Task complexity was manipulated by varying the number of presented stimuli (set size). We collected hit rate and reaction time (RT) as indicators of the amount of information retained in VSTM and the speed of VSTM scanning, respectively. Due to the impurity of these measures, however, the variability in hit rate and RT was assumed to consist not only of genuine variance due to individual differences in VSTM retention and VSTM scanning but also of other, non-experimental portions of variance. Therefore, we identified two qualitatively different types of components for both hit rate and RT: (1) non-experimental components representing processes that remained constant irrespective of set size and (2) experimental components reflecting processes that increased as a function of set size. For RT, intelligence was negatively associated with the non-experimental components, but was unrelated to the experimental components assumed to represent variability in VSTM scanning speed. This finding indicates that individual differences in basic processing speed, rather than in the speed of VSTM scanning, differentiate between individuals of higher and lower intelligence. For hit rate, the experimental component constituting individual differences in VSTM retention was positively related to intelligence. The non-experimental components of hit rate, representing variability in basal processes, however, were not associated with intelligence. By decomposing VSTM functioning into non-experimental and experimental components, significant associations with intelligence were revealed that otherwise might have been obscured.
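
Fixed-links modeling is a structural-equation technique; as a rough illustration of the underlying idea only, the sketch below uses simple per-participant regressions on simulated data (all names and parameter values are invented). The intercept of RT on set size stands in for the constant, non-experimental component and the slope for the experimental, set-size-linked component; only the constant component is made to correlate with the simulated intelligence score.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects = 200
set_sizes = np.array([2.0, 4.0, 6.0, 8.0])

# Invented data: RT = constant component + set-size-dependent component.
base_speed = rng.normal(450.0, 60.0, n_subjects)   # constant part, ms
scan_rate = rng.normal(40.0, 8.0, n_subjects)      # ms per memorized item
iq = 100.0 - 0.08 * (base_speed - 450.0) + rng.normal(0.0, 10.0, n_subjects)

rt = (base_speed[:, None] + scan_rate[:, None] * set_sizes[None, :]
      + rng.normal(0.0, 15.0, (n_subjects, set_sizes.size)))

# Per-participant least squares: intercept ~ non-experimental component,
# slope ~ experimental (set-size-linked) component.
X = np.column_stack([np.ones_like(set_sizes), set_sizes])
(intercepts, slopes), *_ = np.linalg.lstsq(X, rt.T, rcond=None)

r_intercept = np.corrcoef(intercepts, iq)[0, 1]
r_slope = np.corrcoef(slopes, iq)[0, 1]
print(f"IQ vs constant component:        r = {r_intercept:+.2f}")
print(f"IQ vs set-size-linked component: r = {r_slope:+.2f}")
```

On this simulated data, only the constant component correlates (negatively) with the intelligence score, mirroring the dissociation the study reports for RT.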

Relevance: 30.00%

Abstract:

The Genesis mission Solar Wind Concentrator was built to enhance fluences of solar wind by an average of 20× over the 2.3 years that the mission exposed substrates to the solar wind. The Concentrator targets survived the hard landing upon return to Earth and were used to determine the isotopic composition of solar-wind (and hence solar) oxygen and nitrogen. Here we report on the flight operation of the instrument and on simulations of its performance. Concentration and fractionation patterns obtained from simulations are given for He, Li, N, O, Ne, Mg, Si, S, and Ar in SiC targets, and are compared with measured concentrations and isotope ratios for the noble gases. Carbon is also modeled for a Si target. Predicted differences in instrumental fractionation between elements are discussed. Additionally, as the Concentrator was designed only for ions ≤22 amu, implications of analyzing elements as heavy as argon are discussed. Post-flight simulations of instrumental fractionation as a function of radial position on the targets incorporate solar-wind velocity and angular distributions measured in flight, and predict fractionation patterns for various elements and isotopes of interest. A tighter angular distribution, mostly due to better spacecraft spin stability than assumed in pre-flight modeling, results in a steeper isotopic fractionation gradient between the center and the perimeter of the targets. Using the distribution of solar-wind velocities encountered during flight, which are higher than those used in pre-flight modeling, results in elemental abundance patterns slightly less peaked at the center. Mean fractionations trend with atomic mass, with differences relative to the measured isotopes of neon of +4.1±0.9 ‰/amu for Li, between -0.4 and +2.8 ‰/amu for C, +1.9±0.7 ‰/amu for N, +1.3±0.4 ‰/amu for O, -7.5±0.4 ‰/amu for Mg, -8.9±0.6 ‰/amu for Si, and -22.0±0.7 ‰/amu for S (uncertainties reflect Monte Carlo statistics). The slopes of the fractionation trends depend to first order only on the relative differential mass ratio, Δm/m. This article and a companion paper (Reisenfeld et al. 2012, this issue) provide post-flight information necessary for the analysis of the Genesis solar wind samples, and thus serve to complement the Space Science Review volume, The Genesis Mission (v. 105, 2003).
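
Because the reported slopes are expressed per amu, a first-order estimate of the shift for any isotope pair follows by multiplying the slope by the pair's mass difference. The snippet below is a trivial worked example using the central values quoted above (the helper function and its name are ours, not from the paper):

```python
# First-order arithmetic with the per-amu slopes quoted above.
slopes_permil_per_amu = {"N": 1.9, "O": 1.3, "Mg": -7.5, "Si": -8.9, "S": -22.0}

def ratio_shift_permil(element, delta_m_amu):
    """Expected isotope-ratio shift (per mil) for a pair delta_m_amu apart."""
    return slopes_permil_per_amu[element] * delta_m_amu

print(ratio_shift_permil("S", 2))   # 34S/32S pair: -44.0 per mil
print(ratio_shift_permil("O", 2))   # 18O/16O pair
```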

Relevance: 30.00%

Abstract:

With improving clinical CT scanning technology, the accuracy of CT-based finite element (FE) models of the human skeleton may be improved by a better description of apparent-level bone mechanical properties. Micro-finite element (μFE) modeling can be used to study the apparent elastic behavior of human cancellous bone. In this study, samples from the femur, radius and vertebral body were investigated to evaluate the predictive power of morphology–elasticity relationships and to compare them across different anatomical regions. μFE models of 701 trabecular bone cubes with a side length of 5.3 mm were analyzed using kinematic boundary conditions. Based on the FE results, four morphology–elasticity models using bone volume fraction as well as full, limited or no fabric information were calibrated for each anatomical region. The five-parameter Zysset–Curnier model using full fabric information showed excellent predictive power, with coefficients of determination (r²adj) of 0.98, 0.95 and 0.94 for the femur, radius and vertebra data, respectively, and mean total norm errors between 14 and 20%. A constant-orthotropy model and a constant-transverse-isotropy model, where the elastic anisotropy is defined by the model parameters, yielded coefficients of determination between 0.90 and 0.98 with total norm errors between 16 and 25%. Neglecting fabric information and using an isotropic model led to r²adj between 0.73 and 0.92 with total norm errors between 38 and 49%. A comparison of the model regressions revealed minor but significant (p<0.01) differences between the fabric–elasticity model parameters calibrated for the different anatomical regions. The proposed models and identified parameters can be used in future studies to compute the apparent elastic properties of human cancellous bone for homogenized FE models.
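
For orientation, Zysset–Curnier-type morphology–elasticity relationships are commonly written with the orthotropic stiffness scaling as a power law in bone volume fraction (exponent k) and in the fabric eigenvalues (exponent l), with three elastic constants completing the five parameters. The sketch below encodes one such commonly cited form; the parameter values are placeholders for illustration, not the constants calibrated in this study.

```python
import numpy as np

def zc_stiffness(rho, m, lam0=5.0, lam0p=3.0, mu0=4.0, k=1.6, l=1.0):
    """Orthotropic stiffness matrix (Voigt 6x6) from bone volume fraction rho
    and fabric eigenvalues m = (m1, m2, m3). All parameter values here are
    placeholders, not the constants calibrated in the study."""
    C = np.zeros((6, 6))
    f = rho**k
    for i in range(3):
        C[i, i] = (lam0 + 2.0 * mu0) * f * m[i]**(2 * l)   # axial terms
        for j in range(3):
            if i != j:
                C[i, j] = lam0p * f * m[i]**l * m[j]**l    # cross terms
    # Shear terms, Voigt rows 4->(2,3), 5->(1,3), 6->(1,2):
    for a, (i, j) in zip((3, 4, 5), ((1, 2), (0, 2), (0, 1))):
        C[a, a] = 2.0 * mu0 * f * m[i]**l * m[j]**l
    return C

C = zc_stiffness(0.2, (0.9, 1.0, 1.1))
print(np.round(C, 3))
```

The resulting matrix is symmetric and positive definite for physically sensible inputs, which is what a homogenized FE model requires of an apparent-level stiffness.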

Relevance: 30.00%

Abstract:

A general introduction to the state of the art in modeling metal-organic materials using transferable atomic multipoles is provided. The method is based on the building-block partitioning of the electron density, which is illustrated with some examples of potential applications and with detailed discussions of the advantages and pitfalls. The interactions taking place between building blocks are summarized and used to discuss the properties that can be calculated.

Relevance: 30.00%

Abstract:

In order to analyze software systems, it is necessary to model them. Static software models are commonly imported by parsing source code and related data. Unfortunately, building custom parsers for most programming languages is a non-trivial endeavour. This poses a major bottleneck for analyzing software systems programmed in languages for which importers do not already exist. Luckily, initial software models do not require detailed parsers, so it is possible to start analysis with a coarse-grained importer, which is then gradually refined. In this paper we propose an approach to "agile modeling" that exploits island grammars to extract initial coarse-grained models, parser combinators to enable gradual refinement of model importers, and various heuristics to recognize language structure, keywords and other language artifacts.
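
As a toy illustration of the island-grammar idea, the sketch below treats class and method declarations as "islands" and everything else as "water" to be skipped; a real importer would replace the regular expressions with parser combinators that can be refined incrementally (all names here are invented):

```python
import re

# Islands: constructs the coarse model cares about; water: everything else.
ISLANDS = [
    ("class",  re.compile(r"\bclass\s+(\w+)")),
    ("method", re.compile(r"\bdef\s+(\w+)\s*\(")),
]

def coarse_model(source):
    """First-pass importer: collect (kind, name) pairs, ignore the rest."""
    model = []
    for line in source.splitlines():
        for kind, pat in ISLANDS:
            m = pat.search(line)
            if m:
                model.append((kind, m.group(1)))
    return model

code = """
class Parser:
    def parse(self, text):
        pass  # water: bodies are ignored at this stage
"""
print(coarse_model(code))  # [('class', 'Parser'), ('method', 'parse')]
```

Gradual refinement then means replacing one island at a time with a finer-grained rule (e.g. also capturing parameters) without rewriting the importer as a whole.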

Relevance: 30.00%

Abstract:

Recognizing the increasing amount of information shared on Social Networking Sites (SNS), in this study we explore the information processing strategies of users on Facebook. Specifically, we investigate the impact of various factors on user attitudes towards the posts in their Newsfeed. To collect the data, we programmed a Facebook application that allows users to evaluate posts in real time. Applying Structural Equation Modeling to a sample of 857 observations, we find that it is mostly the affective attitude that shapes user behavior on the network. This attitude, in turn, is determined mainly by the communication intensity between users, outweighing the comprehensibility of the post and all but neglecting post length and user posting frequency.

Relevance: 30.00%

Abstract:

Despite the broad range of collaboration tools already available, enterprises continue to look for ways to improve internal and external communication. Microblogging is one such new communication channel, with considerable potential to improve intra-firm transparency and knowledge sharing. However, the adoption of such social software presents certain challenges to enterprises. Based on the results of four focus group sessions, we identified several new constructs that play an important role in the microblogging adoption decision. Examples include privacy concerns, communication benefits, perceptions regarding signal-to-noise ratio, as well as codification effort. Integrating these findings with common views on technology acceptance, we formulate a model to predict the adoption of a microblogging system in the workplace. Our findings serve as an important guideline for managers seeking to realize the potential of microblogging in their companies.

Relevance: 30.00%

Abstract:

Empirical evidence and theoretical studies suggest that the phenotype, i.e., cellular- and molecular-scale dynamics such as proliferation rate and adhesiveness arising from microenvironmental factors and gene expression, which governs tumor growth and invasiveness, also determines gross tumor-scale morphology. It has been difficult to quantify the relative effect of these links on disease progression and prognosis using conventional clinical and experimental methods and observables. As a result, successful individualized treatment of highly malignant and invasive cancers, such as glioblastoma, via surgical resection and chemotherapy cannot be offered, and outcomes are generally poor. What is needed is a deterministic, quantifiable method to enable understanding of the connections between phenotype and tumor morphology. Here, we critically assess advantages and disadvantages of recent computational modeling efforts (e.g., continuum, discrete, and cellular automata models) that have pursued this understanding. Based on this assessment, we review a multiscale, i.e., from the molecular to the gross tumor scale, mathematical and computational "first-principle" approach based on mass conservation and other physical laws, such as employed in reaction-diffusion systems. Model variables describe known characteristics of tumor behavior, and parameters and functional relationships across scales are informed from in vitro, in vivo and ex vivo biology. We review the feasibility of this methodology which, once coupled to tumor imaging and tumor biopsy or cell culture data, should enable prediction of tumor growth and therapy outcome through quantification of the relation between the underlying dynamics and morphological characteristics. In particular, morphologic stability analysis of this mathematical model reveals that tumor cell patterning at the tumor-host interface is regulated by cell proliferation, adhesion and other phenotypic characteristics: histopathological information on the tumor boundary can be input into the mathematical model and used as a phenotype-diagnostic tool to predict collective and individual tumor cell invasion of surrounding tissue. This approach further provides a means to deterministically test the effects of novel and hypothetical therapy strategies on tumor behavior.
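
As a minimal, purely illustrative instance of the reaction-diffusion approach described above, the sketch below integrates a 1D Fisher-KPP equation (diffusion standing in for invasion plus logistic proliferation) with explicit finite differences; the parameters are arbitrary and not fitted to any tumor data.

```python
import numpy as np

# Illustrative 1D Fisher-KPP model: du/dt = D*d2u/dx2 + rho*u*(1-u),
# with u a normalized tumor cell density. Parameters are arbitrary.
D, rho = 0.1, 1.0                    # "invasion" and "proliferation" rates
nx, dx, dt, steps = 200, 0.5, 0.1, 1000

u = np.zeros(nx)
u[:10] = 1.0                         # initial tumor mass at the left edge

for _ in range(steps):
    lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
    lap[0] = lap[-1] = 0.0           # crude treatment of the domain ends
    u = np.clip(u + dt * (D * lap + rho * u * (1.0 - u)), 0.0, 1.0)

front = int(np.argmax(u < 0.5))      # index where density drops below 0.5
print(f"invasion front near x = {front * dx:.1f} at t = {steps * dt:.0f}")
```

The traveling front that emerges (speed roughly 2·sqrt(D·rho) for this equation) is the 1D analogue of the tumor-host interface whose stability the morphologic analysis interrogates.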

Relevance: 30.00%

Abstract:

Family preservation has been criticized for implementing programs that are not theoretically founded. One result of this circumstance is a lack of information regarding the processes and outcomes of family preservation services. The knowledge base of family preservation is thus rather limited at present and will remain limited unless theory is consistently integrated within individual programs. A model for conceptualizing how theoretical consistency may be implemented within programs is presented and applied to family preservation. Programs also need to establish theoretical consistency before theoretical diversity, both within individual programs and across multiple programs, in order to advance the field in meaningful ways. A developmental cycle of knowledge generation is presented and applied to family preservation.

Relevance: 30.00%

Abstract:

When tilted sideways, participants misperceive the visual vertical as assessed by means of a luminous line in otherwise complete darkness. A recent modeling approach (De Vrijer et al., 2009) claimed that the typical patterns of errors (known as A- and E-effects) could be explained by assuming that participants behave in a Bayes-optimal manner. In this study, we experimentally manipulate participants’ prior information about body-in-space orientation and measure the effect of this manipulation on the subjective visual vertical (SVV). Specifically, we explore the effects of veridical and misleading instructions about body tilt orientation on the SVV. We used a psychophysical 2AFC SVV task at roll tilt angles of 0 degrees, 16 degrees and 4 degrees CW and CCW. Participants were tilted to 4 degrees under different instruction conditions: in one condition, participants received veridical instructions as to their tilt angle, whereas in another condition, participants received the misleading instruction that their body position was perfectly upright. Our results indicate systematic differences between the instruction conditions at 4 degrees CW and CCW. Participants did not simply use an ego-centric reference frame in the misleading condition; instead, their estimates of the SVV seem to lie between the head’s Z-axis and the estimate of the SVV as measured in the veridical condition. All participants displayed A-effects at roll tilt angles of 16 degrees CW and CCW. We discuss our results in the context of the Bayesian model of De Vrijer et al. (2009), and argue that this pattern of results is consistent with a manipulation of the precision of a prior distribution over body-in-space orientations.
Furthermore, we introduce a Bayesian Generalized Linear Model for estimating parameters of participants’ psychometric function, which allows us to jointly estimate group level and individual level parameters under all experimental conditions simultaneously, rather than relying on the traditional two-step approach to obtaining group level parameter estimates.
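
For readers unfamiliar with psychometric-function fitting, the sketch below fits a cumulative-Gaussian psychometric function to simulated 2AFC data by maximum likelihood. This is a classical single-subject fit, not the hierarchical Bayesian GLM the authors introduce, and all numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated 2AFC SVV data (all values invented): line angles in degrees and
# counts of "tilted CW" responses out of n_trials per angle.
angles = np.array([-4.0, -2.0, -1.0, 0.0, 1.0, 2.0, 4.0])
n_trials = np.full(angles.size, 40)
rng = np.random.default_rng(1)
true_mu, true_sigma = 0.8, 1.5      # simulated SVV bias and psychometric slope
n_cw = rng.binomial(n_trials, norm.cdf((angles - true_mu) / true_sigma))

def neg_log_lik(params):
    """Binomial negative log-likelihood of a cumulative-Gaussian curve."""
    mu, log_sigma = params
    p = np.clip(norm.cdf((angles - mu) / np.exp(log_sigma)), 1e-6, 1 - 1e-6)
    return -np.sum(n_cw * np.log(p) + (n_trials - n_cw) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], float(np.exp(fit.x[1]))
print(f"estimated SVV bias = {mu_hat:.2f} deg, sigma = {sigma_hat:.2f} deg")
```

The point of the Bayesian GLM in the study is precisely to replace many such isolated fits with one joint model that pools information across participants and conditions.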

Relevance: 30.00%

Abstract:

Contraction, strike-slip, and extension displacements along the Hikurangi margin northeast of the North Island of New Zealand coincide with large lateral gradients in material properties. We use a finite-difference code utilizing elastic and elastic-plastic rheologies to build large-scale, three-dimensional numerical models which investigate the influence of material properties on velocity partitioning within oblique subduction zones. Rheological variation in the oblique models is constrained by seismic velocity and attenuation information available for the Hikurangi margin. We compare the effect of weakly versus strongly coupled subduction interfaces on the development of extension and the partitioning of velocity components for orthogonal and oblique convergence and include the effect of ponded sediments beneath the Raukumara Peninsula. Extension and velocity partitioning occur if the subduction interface is weak, but neither develops if the subduction interface is strong. The simple mechanical model incorporating rheological variation based on seismic observations produces kinematics that closely match those published from the Hikurangi margin. These include extension within the Taupo Volcanic Zone, uplift over ponded sediments, and dextral contraction to the south.

Relevance: 30.00%

Abstract:

Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
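
A minimal sketch of the proposed pipeline, with ordinary PCA on discretized curves standing in for FPCA and a linear map as the error model (the curve family, the proxy bias, and all parameters are invented):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_real, n_t = 120, 50
t = np.linspace(0.0, 1.0, n_t)

# Invented learning set: exact curves and systematically biased proxy curves.
a = rng.uniform(0.8, 1.2, n_real)[:, None]
exact = np.sin(2 * np.pi * a * t) + 0.05 * rng.normal(size=(n_real, n_t))
proxy = 0.8 * np.sin(2 * np.pi * a * t) + 0.1

# Discretized stand-in for FPCA: PCA scores of the two sets of curves.
pca_exact = PCA(n_components=5).fit(exact)
pca_proxy = PCA(n_components=5).fit(proxy)
z_exact = pca_exact.transform(exact)
z_proxy = pca_proxy.transform(proxy)

# Error model: learn the map from proxy scores to exact scores.
err_model = LinearRegression().fit(z_proxy, z_exact)

# Predict the exact response of a new realization from its proxy alone.
a_new = 1.1
proxy_new = (0.8 * np.sin(2 * np.pi * a_new * t) + 0.1)[None, :]
pred = pca_exact.inverse_transform(
    err_model.predict(pca_proxy.transform(proxy_new)))
rms = float(np.sqrt(np.mean((pred[0] - np.sin(2 * np.pi * a_new * t)) ** 2)))
print(f"RMS error of predicted exact curve: {rms:.3f}")
```

Because the prediction only requires the proxy solve for new realizations, the expensive exact model is run solely on the learning set, which is the computational saving the methodology targets.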