14 results for "semi binary based feature detector descriptor"
in Helda - Digital Repository of the University of Helsinki
Abstract:
In this dissertation, I present an overall methodological framework for studying linguistic alternations, focusing specifically on lexical variation in denoting a single meaning, that is, synonymy. As a practical example, I employ the synonymous set of the four most common Finnish verbs denoting THINK, namely ajatella, miettiä, pohtia and harkita ‘think, reflect, ponder, consider’. As a continuation of previous work, I describe in considerable detail the extension of statistical methods from dichotomous linguistic settings (e.g., Gries 2003; Bresnan et al. 2007) to polytomous ones, that is, settings with more than two possible alternative outcomes. The applied statistical methods are arranged into a succession of stages of increasing complexity, proceeding from univariate via bivariate to multivariate techniques. As the central multivariate method, I argue for the use of polytomous logistic regression and demonstrate its practical application to the studied phenomenon, thus extending the work of Bresnan et al. (2007), who applied simple (binary) logistic regression to a dichotomous structural alternation in English. The results of the various statistical analyses confirm that a wide range of contextual features across different categories is indeed associated with the use and selection of the studied THINK lexemes; however, a substantial part of these features is not exemplified in current Finnish lexicographical descriptions. The multivariate analysis results indicate that the semantic classifications of syntactic argument types are on average the most distinctive feature category, followed by overall semantic characterizations of the verb chains, and then syntactic argument types alone, with morphological features pertaining to the verb chain and extra-linguistic features relegated to the last position.
In terms of the overall performance of the multivariate analysis and modeling, the prediction accuracy seems to reach a ceiling at a recall rate of roughly two-thirds of the sentences in the research corpus. The analysis of these results suggests a limit to what can be explained and determined within the immediate sentential context by applying the conventional descriptive and analytical apparatus based on currently available linguistic theories and models. The results also support Bresnan's (2007) and others' (e.g., Bod et al. 2003) probabilistic view of the relationship between linguistic usage and the underlying linguistic system, in which only a minority of linguistic choices are categorical, given the known context, represented as a feature cluster, that can be analytically grasped and identified. Instead, most contexts exhibit degrees of variation as to their outcomes, resulting in proportionate choices over longer stretches of usage in texts or speech.
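As a sketch of the polytomous setting described above, the following shows how a multinomial (softmax) logistic regression model turns per-outcome linear scores into a probability distribution over more than two alternatives. The four classes, three binary context features and all weights are invented for illustration; they are not the fitted model of the dissertation.

```python
import math

# Polytomous (multinomial) logistic regression gives each outcome class k a
# linear score w_k . x + b_k and converts the scores to probabilities with
# the softmax function. Weights here are illustrative, not fitted to data.

def softmax(scores):
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict(features, weights, biases):
    """Return a probability for each class given a binary feature vector."""
    scores = [sum(w * x for w, x in zip(ws, features)) + b
              for ws, b in zip(weights, biases)]
    return softmax(scores)

# Hypothetical example: four classes (e.g. the four THINK verbs), three
# contextual features; all numbers below are made up for illustration.
weights = [[1.0, 0.0, -0.5],
           [0.2, 0.8,  0.0],
           [-0.3, 0.5, 0.9],
           [0.0, -0.2, 0.4]]
biases = [0.1, 0.0, -0.1, 0.0]
probs = predict([1, 0, 1], weights, biases)
print([round(p, 3) for p in probs])   # four probabilities summing to 1
```

Unlike the binary case, each class keeps its own weight vector, so the model can express class-specific feature preferences directly.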
Abstract:
The number of drug substances in formulation development in the pharmaceutical industry is increasing. Some of these are amorphous drugs with a glass transition below ambient temperature, and thus they are usually difficult to formulate and handle. One reason for this is the reduced viscosity, related to the stickiness of the drug, which makes them complicated to handle in unit operations. Thus, the aim of this thesis was to develop a new processing method for a sticky amorphous model material. Furthermore, model materials were characterised before and after formulation, using several characterisation methods, to understand more precisely the prerequisites for the physical stability of the amorphous state against crystallisation. The model materials used were monoclinic paracetamol and citric acid anhydrate. Amorphous materials were prepared by melt quenching or by ethanol evaporation methods. The melt blends were found to have slightly higher viscosity than the ethanol-evaporated materials. However, melt-produced materials crystallised more easily upon consecutive shearing than ethanol-evaporated materials. The only material that did not crystallise during shearing was a 50/50 (w/w, %) blend, regardless of the preparation method, and it was physically stable for at least two years in dry conditions. Shearing at varying temperatures was established as a method to measure the physical stability of amorphous materials under processing and storage conditions. The actual physical stability of the blends was better than that of the pure amorphous materials at ambient temperature. Molecular mobility was not related to the physical stability of the amorphous blends, observed as crystallisation. The molecular mobility of the 50/50 blend, derived from spectral linewidth as a function of temperature using solid-state NMR, correlated better with the molecular mobility derived from a rheometer than with that derived from differential scanning calorimetry data.
Based on the results obtained, the effects of molecular interactions, thermodynamic driving force and miscibility of the blends are discussed as the key factors in stabilising the blends. Stickiness was found to be affected by glass transition and viscosity. Ultrasound extrusion and cutting were successfully tested as means to increase the processability of the sticky material. Furthermore, it was found to be possible to process the physically stable 50/50 blend in a supercooled liquid state instead of a glassy state. The method was not found to accelerate crystallisation. This may open up new possibilities to process amorphous materials that are otherwise impossible to manufacture into solid dosage forms.
Abstract:
The earliest stages of human cortical visual processing can be conceived as extraction of local stimulus features. However, more complex visual functions, such as object recognition, require integration of multiple features. Recently, neural processes underlying feature integration in the visual system have been under intensive study. A specialized mid-level stage preceding the object recognition stage has been proposed to account for the processing of contours, surfaces and shapes as well as configuration. This thesis consists of four experimental psychophysical studies on human visual feature integration. In two studies, the classification image technique, a recently developed psychophysical reverse correlation method, was used. In this method, visual noise is added to near-threshold stimuli. By investigating the relationship between random features in the noise and the observer's perceptual decision in each trial, it is possible to estimate which features of the stimuli are critical for the task. The method allows the critical features used in a psychophysical task to be visualized directly as a spatial correlation map, yielding an effective "behavioral receptive field". Visual context is known to modulate the perception of stimulus features. Some of these interactions are quite complex, and it is not known whether they reflect early or late stages of perceptual processing. The first study investigated the mechanisms of collinear facilitation, where nearby collinear Gabor flankers increase the detectability of a central Gabor. The behavioral receptive field of the mechanism mediating the detection of the central Gabor stimulus was measured by the classification image method. The results show that collinear flankers increase the extent of the behavioral receptive field for the central Gabor in the direction of the flankers. The increased sensitivity at the ends of the receptive field suggests a low-level explanation for the facilitation.
The second study investigated how visual features are integrated into percepts of surface brightness. A novel variant of the classification image method with a brightness matching task was used. Many theories assume that perceived brightness is based on the analysis of luminance border features. Here, this assumption was directly tested for the first time. The classification images show that the perceived brightness of both an illusory Craik-O'Brien-Cornsweet stimulus and a real uniform step stimulus depends solely on the border. Moreover, the spatial tuning of the features remains almost constant when the stimulus size is changed, suggesting that brightness perception is based on the output of a single spatial frequency channel. The third and fourth studies investigated global form integration in random-dot Glass patterns. In these patterns, a global form can be immediately perceived even if only a small proportion of the random dots is paired into dipoles according to a geometrical rule. In the third study, the discrimination of orientation structure in highly coherent concentric and Cartesian (straight) Glass patterns was measured. The results showed that the global form was more efficiently discriminated in concentric patterns. The fourth study investigated how form detectability depends on the global regularity of the Glass pattern. The local structure was either Cartesian or curved. It was shown that randomizing the local orientation deteriorated performance only with the curved pattern. The results support the idea that curved and Cartesian patterns are processed in at least partially separate neural systems.
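The core computation behind a classification image can be sketched as follows: average the noise fields from trials on which the observer responded "yes", subtract the average from "no" trials, and the difference peaks at the stimulus locations the decision actually used. The one-dimensional "stimulus", the linear template observer and all parameters below are invented for illustration; they are not the stimuli or observers of the thesis.

```python
import random

random.seed(0)

# A classification image is the mean of the noise fields on "yes" trials
# minus the mean on "no" trials. The simulated observer below is a toy
# linear template matcher sensitive only to pixels 6-9.

N_PIXELS = 16
template = [1.0 if 6 <= i <= 9 else 0.0 for i in range(N_PIXELS)]

def observer_says_yes(noise):
    # Toy observer: responds "yes" when the noise correlates positively
    # with its internal template.
    return sum(t * n for t, n in zip(template, noise)) > 0

yes_trials, no_trials = [], []
for _ in range(20000):
    noise = [random.gauss(0.0, 1.0) for _ in range(N_PIXELS)]
    (yes_trials if observer_says_yes(noise) else no_trials).append(noise)

def mean_field(trials):
    return [sum(t[i] for t in trials) / len(trials) for i in range(N_PIXELS)]

ci = [y - n for y, n in zip(mean_field(yes_trials), mean_field(no_trials))]
# The classification image should peak inside the template region (6-9),
# recovering the observer's "behavioral receptive field" from responses alone.
```

Real experiments replace the simulated observer with a human, but the correlation map is computed in exactly this way.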
Abstract:
There exist various suggestions for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic suggestion, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault-tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems that are described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique; rather, the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the presented general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered.
Studying their applicability for quantum computation could be a topic of further research.
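As a concrete illustration of what a non-trivial fusion algebra looks like, the sketch below encodes the Fibonacci anyon model, the simplest standard non-abelian example. Note this is not the S_3 quantum double model derived in the thesis, whose spectrum and fusion rules are richer; it is used here only to show non-unique fusion outcomes.

```python
# Fusion rules of the Fibonacci anyon model (standard illustrative example,
# NOT the S_3 quantum double of the thesis):
#   1 x a = a,   tau x tau = 1 + tau
FUSION = {
    ("1", "1"): ["1"],
    ("1", "tau"): ["tau"],
    ("tau", "1"): ["tau"],
    ("tau", "tau"): ["1", "tau"],   # fusion outcome is not unique
}

def fuse(anyons):
    """Fuse a list of anyons left to right, returning all possible outcomes."""
    outcomes = [anyons[0]]
    for a in anyons[1:]:
        outcomes = sorted({o for prev in outcomes for o in FUSION[(prev, a)]})
    return outcomes

# Fusing three tau particles can yield either the vacuum or another tau;
# this degeneracy is the degree of freedom used to store quantum information.
print(fuse(["tau", "tau", "tau"]))   # ['1', 'tau']
```

The multiple possible total charges of a fixed set of anyons form the protected state space in which braiding operations act.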
Abstract:
Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples. Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator, in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by using a motor to rotate the nozzle spraying the carbon dioxide. The fastest rotation and highest modulation frequency were achieved with a permanent magnet motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With the modulators developed in this study, the narrowest peaks were 75 ms at base. Three data analysis programs were developed, allowing basic, comparison and identification operations. The basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry.
In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal phase liquid chromatography with ultraviolet detection worked well in the removal of aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes. However, GC×GC with time-of-flight mass spectrometry was needed in the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual searching and analyst knowledge remain invaluable as well. Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, a simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC, so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of the separation of PAHs is not significantly disturbed by the amount of matrix, and that quantitativeness suffers only slightly in the presence of matrix when the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily overcome some minor drawbacks of the technique. The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
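The basic visualisation step of such GC×GC data analysis programs can be sketched as follows: the one-dimensional detector trace is cut at every modulation period and the cuts are stacked, so the row index corresponds to first-dimension retention and the column index to second-dimension retention. The toy trace and period below are invented for illustration.

```python
# In GCxGC the raw one-dimensional detector trace is cut at every modulation
# period; stacking the cuts yields the two-dimensional retention plane.

def fold_to_2d(signal, points_per_modulation):
    """Reshape a 1-D detector trace into [modulation][within-modulation] rows."""
    n_mod = len(signal) // points_per_modulation
    return [signal[i * points_per_modulation:(i + 1) * points_per_modulation]
            for i in range(n_mod)]

# Toy trace: 12 samples, modulation period of 4 samples -> a 3 x 4 plane.
trace = [0, 1, 5, 1, 0, 2, 9, 2, 0, 1, 4, 1]
plane = fold_to_2d(trace, 4)

# First-dimension retention = row index, second-dimension = column index.
peak_row = max(range(len(plane)), key=lambda r: max(plane[r]))
print(peak_row)   # 1: the modulation containing the tallest peak
```

Peak heights and volumes are then read directly off this plane, which is what the basic-operations program above provides.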
Abstract:
This study, Contested Lands: Land disputes in semi-arid parts of northern Tanzania. Case Studies of the Loliondo and Sale Division in the Ngorongoro District, concentrates on describing the specific land disputes that took place in the 1990s in the Loliondo and Sale Divisions of the Ngorongoro District in northern Tanzania. The study shows the territorial and historical transformation of territories and property and their relation to the land disputes of the 1990s. It was assumed, firstly, that the land disputes were linked to changing spatiality due to the zoning policies of State territoriality and, secondly, that they can be related to State control of property, whereby the ownership of land has been redefined through statutory laws. In the analysis of the land disputes, issues such as the use of territoriality, boundary construction and property claims in geographical space are highlighted. Generally, from the 1980s onwards, increases in the human population within both Divisions have put pressure on land and resources. This has led to increased control of land and resources, to the construction of boundaries and finally to formalized land rights on the village lands of the Loliondo Division. The land disputes have thus been linked to the use of legal power and to the re-creation of boundaries (informal or formal) by either the Maasai or the Sonjo on the Loliondo and Sale village lands. In the Loliondo Division, land disputes have been resource-based and related to multiple allocations of land or game resource concessions. Land disputes became clearly political and legal struggles with an ecological reference. Land disputes were stimulated when the common land and resource rights of the Maasai pastoralists on village lands became regulated and insecure. The analysis of past land disputes showed that space-place tensions on village lands can be presented as a platform on which spatial and property issues with complex power relations have been debated.
The reduction of future land disputes will succeed only if local property rights to land and resources are acknowledged, especially in the rural lands of the Tanzanian State.
Abstract:
Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
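The covering relation at the heart of such routing tables can be sketched with a deliberately simplified filter model: each filter is a set of per-attribute numeric intervals (the actual filter language of content-based routers is richer). Filter f1 covers f2 when every event matching f2 also matches f1, in which case f2 need not be forwarded upstream separately.

```python
# Simplified content-based filter model: per-attribute numeric intervals.
# f1 covers f2 iff f2's matching events are a subset of f1's. For interval
# constraints this means every f1 interval contains the corresponding f2
# interval, and f1 constrains no attribute that f2 leaves unconstrained.

def covers(f1, f2):
    """True if filter f1 covers filter f2."""
    for attr, (lo2, hi2) in f2.items():
        if attr not in f1:
            continue  # f1 leaves this attribute unconstrained, hence wider
        lo1, hi1 = f1[attr]
        if not (lo1 <= lo2 and hi2 <= hi1):
            return False
    # An attribute constrained only by f1 would make f1 narrower, not wider.
    return all(attr in f2 for attr in f1)

broad = {"price": (0, 100)}
narrow = {"price": (10, 20), "volume": (0, 5)}
print(covers(broad, narrow))   # True: broad admits every event narrow does
print(covers(narrow, broad))   # False
```

A routing table organized by this relation can keep only non-covered filters on each outgoing link, which is the optimization target that filter merging then extends.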
Abstract:
Semi-natural grasslands are the most important agricultural areas for biodiversity. The present study investigates the effects of traditional livestock grazing and mowing on plant species richness, the main emphasis being on cattle grazing in mesic semi-natural grasslands. The two reviews provide a thorough assessment of the multifaceted impacts and importance of grazing and mowing management for plant species richness. It is emphasized that livestock grazing and mowing have partially compensated for the suppression of major natural disturbances by humans and mitigated the negative effects of eutrophication. This hypothesis has important consequences for nature conservation: a large proportion of European species originally adapted to natural disturbances may at present be dependent on livestock grazing and/or mowing. Furthermore, grazing and mowing are key management methods for mitigating the effects of nutrient enrichment. The species composition and richness in old (continuously grazed), new (grazing restarted 3-8 years ago) and abandoned (for over 10 years) pastures differed consistently across a range of spatial scales, and were intermediate in new pastures compared with old and abandoned pastures. In mesic grasslands, most plant species were shown to benefit from cattle grazing. Indicator species of biologically valuable grasslands and rare species were more abundant in grazed than in abandoned grasslands. Steep S-SW-facing slopes are the most suitable sites for many grassland plants and should be prioritized in grassland restoration. The proportion of species trait groups benefiting from grazing was higher in mesic semi-natural grasslands than in dry and wet grasslands. Consequently, species trait responses to grazing and the effectiveness of the natural factors limiting plant growth may be intimately linked. The high plant species richness of traditionally mowed and grazed areas is explained by numerous factors which operate on different spatial scales.
Particularly important for maintaining large-scale plant species richness are evolutionary and mitigation factors. Grazing and mowing cause a shift towards the conditions that occurred during the evolutionary history of European plant species by modifying key ecological factors (nutrients, pH and light). The results of this dissertation suggest that restoration of semi-natural grasslands by private farmers is potentially a useful method for managing biodiversity in the agricultural landscape. However, the quality of management is commonly inadequate, particularly due to financial constraints. For enhanced success of restoration, management regulations in the agri-environment scheme need to be defined more explicitly, and the scheme should be revised to encourage management of biodiversity.
Abstract:
One major reason for the global decline of biodiversity is habitat loss and fragmentation. Conservation areas can be designed to reduce biodiversity loss, but as resources are limited, conservation efforts need to be prioritized in order to achieve the best possible outcomes. The field of systematic conservation planning developed as a response to opportunistic approaches to conservation that often resulted in biased representation of biological diversity. The last two decades have seen the development of increasingly sophisticated methods that account for information about biodiversity conservation goals (benefits), economic considerations (costs) and socio-political constraints. In this thesis I focus on two general topics related to systematic conservation planning. First, I address two aspects of the question of how biodiversity features should be valued. (i) I investigate the extremely important but often neglected issue of differential prioritization of species for conservation. Species prioritization can be based on various criteria, and is always goal-dependent, but it can also be implemented in a scientifically more rigorous way than is the usual practice. (ii) I introduce a novel framework for conservation prioritization, which is based on continuous benefit functions that convert increasing levels of biodiversity feature representation into increasing conservation value, using the principle that more is better. Traditional target-based systematic conservation planning is a special case of this approach, in which a step function is used as the benefit function. We have further expanded the benefit function framework for area prioritization to address issues such as protected area size and habitat vulnerability. In the second part of the thesis I address the application of community-level modelling strategies to conservation prioritization.
One of the most serious issues in systematic conservation planning currently is not a deficiency of methodology for selection and design, but simply the lack of data. Community-level modelling offers a surrogate strategy that makes conservation planning more feasible in data-poor regions. We have reviewed the available community-level approaches to conservation planning. These range from simplistic classification techniques to sophisticated modelling and selection strategies. We have also developed a general and novel community-level approach to conservation prioritization that significantly improves on previously available methods. This thesis introduces further degrees of realism into conservation planning methodology. The benefit-function-based conservation prioritization framework largely circumvents the problematic phase of target setting and, by allowing trade-offs between species representations, provides a more flexible and hopefully more attractive approach for conservation practitioners. The community-level approach seems highly promising and should prove valuable for conservation planning, especially in data-poor regions. Future work should focus on integrating prioritization methods to deal with the multiple aspects that in combination influence the prioritization process, and on further testing and refining the community-level strategies using real, large datasets.
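The contrast between target-based planning and the continuous benefit-function framework described above can be sketched as follows; both function shapes and all parameters are invented for illustration and are not the benefit functions used in the thesis.

```python
# A step benefit function reproduces traditional target-based planning:
# representation below the target earns nothing, at or above it earns full
# value. A continuous saturating function credits every increment, with
# diminishing returns ("more is better").

def step_benefit(representation, target):
    return 1.0 if representation >= target else 0.0

def continuous_benefit(representation, half_saturation):
    # Concave curve in [0, 1); representation == half_saturation gives 0.5.
    return representation / (representation + half_saturation)

# Below target the step function sees no value at all, whereas the
# continuous function still credits partial representation.
print(step_benefit(0.5, target=1.0))            # 0.0
print(round(continuous_benefit(0.5, 0.5), 3))   # 0.5
```

Because the continuous curve never flattens to zero gradient below a threshold, area selection under it can trade representation of one feature against another instead of treating every unmet target as worthless.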
Abstract:
This work focuses on the factors affecting species richness, abundance and species composition of butterflies and moths in Finnish semi-natural grasslands, with a special interest in the effects of grazing management. In addition, an aim was to evaluate the effectiveness of the support for livestock grazing in semi-natural grasslands that is included in the Finnish agri-environment scheme. In the first field study, butterfly and moth communities in resumed semi-natural pastures were compared to those in old, annually grazed pastures and in abandoned previous pastures. Butterfly and moth species composition in restored pastures resembled the compositions observed in old pastures after circa five years of resumed cattle grazing, but the diversity of butterflies and moths in resumed pastures remained at a lower level than in old pastures. None of the butterfly and moth species typical of old pastures had become more abundant in restored pastures compared with abandoned pastures. Therefore, it appears that restoration of butterfly and moth communities inhabiting semi-natural grasslands requires a longer time than was available for monitoring in this study. In the second study, it was shown that local habitat quality has the largest impact on the occurrence and abundance of butterflies and moths, compared to the effects of grassland patch area and connectivity of the regional grassland network. This emphasizes the importance of current and historical management of semi-natural grasslands for butterfly and moth communities. A positive effect of habitat connectivity was observed on the total abundance of the declining butterflies and moths, suggesting that these species have their strongest populations in well-connected habitat networks. The highest species richness and peak abundance of most individual butterfly and moth species were generally observed in taller grassland vegetation than was the case for vascular plants, suggesting a preference for less intensive management among insects.
These differences between plants and their insect herbivores may be understood in the light of both (1) the higher structural diversity of tall vegetation and (2) the weaker tolerance of disturbances by herbivorous insects due to their higher trophic level compared to plants. The ecological requirements of all species and species groups inhabiting semi-natural grasslands are probably never met at single restricted sites. Therefore, regional implementation of management to create differently managed areas is imperative for the conservation of the various species and species groups dependent on semi-natural grasslands. With limited resources, it might be reasonable to focus much of the management effort on the densest networks of suitable habitat to minimise the risk of extinction of the declining species.
Abstract:
In this thesis, a manifold learning method is applied to the problem of WLAN positioning and automatic radio map creation. Due to the nature of WLAN signal strength measurements, a signal map created from raw measurements results in non-linear distance relations between measurement points. These signal strength vectors reside in a high-dimensional coordinate system. With the help of the so-called Isomap algorithm, the dimensionality of this map can be reduced, making it easier to process. By embedding position-labeled strategic key points, we can automatically adjust the mapping to match the surveyed environment. The environment is thus learned in a semi-supervised way; gathering training points and embedding them in a two-dimensional manifold gives us a rough mapping of the measured environment. After a calibration phase, where the labeled key points in the training data are used to associate coordinates in the manifold representation with geographical locations, we can perform positioning using the adjusted map. This can be achieved through a traditional supervised learning process, which in our case is a simple nearest-neighbors matching of a sampled signal strength vector. We deployed this system in two locations on the Kumpula campus in Helsinki, Finland. Results indicate that positioning based on the learned radio map can achieve good accuracy, especially in hallways or other areas of the environment where the WLAN signal is constrained by obstacles such as walls.
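Once the embedding has been calibrated, the positioning step itself is a plain nearest-neighbor match of a sampled signal-strength vector against the labeled training points. The miniature radio map below (three access points, three locations) is invented for illustration; in the thesis the coordinates would come from the calibrated Isomap embedding.

```python
import math

# Final positioning step: match a sampled signal-strength vector (in dBm)
# against calibrated training points by nearest-neighbour search. The map
# entries (signal vector -> 2-D coordinate) are made up for illustration.

radio_map = [
    ([-40, -70, -80], (0.0, 0.0)),   # e.g. hallway A
    ([-75, -45, -85], (5.0, 0.0)),   # e.g. hallway B
    ([-80, -78, -42], (5.0, 4.0)),   # e.g. lab corner
]

def locate(sample):
    """Return the coordinate whose training vector is closest (Euclidean)."""
    def dist(vec):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(vec, sample)))
    vec, coord = min(radio_map, key=lambda entry: dist(entry[0]))
    return coord

print(locate([-42, -68, -79]))   # (0.0, 0.0): closest to hallway A
```

The benefit of the manifold step is precisely that these coordinates live in a low-dimensional space where Euclidean distance approximates physical distance, so this simple matcher suffices.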
Abstract:
This research is connected with an education development project for the four-year officer education program at the National Defence University. In this curriculum, physics was studied in two alternative course plans, namely a scientific and a general one. Observations connected to the latter, e.g. student feedback and learning outcomes, indicated that action was needed to support the course. The reform work focused on the production of aligned, course-related instructional material. The learning material project produced a customized textbook set for the students of the general basic physics course. The research adapts phases that are typical of Design-Based Research (DBR). The research analyses the feature requirements for a physics textbook aimed at a specific sector and the frames supporting instructional material development, and summarizes the experiences gained in the learning material project when the selected frames were applied. The quality of instructional material is an essential part of qualified teaching. The goal of instructional material customization is to increase the product's customer-centric nature and to enhance its function as a support medium for the learning process. Textbooks are still one of the core elements in physics teaching. The idea of a textbook will remain, but its form and appearance may change according to the prevailing technology. The work deals with substance-connected frames (demands on a physics textbook according to the PER viewpoint, quality thinking in educational material development), frames of university pedagogy, and instructional material production processes. A wide knowledge and understanding of the different frames is useful in development work if they are to be utilized to aid inspiration without limiting new reasoning and new kinds of models. Applying customization even in the frame utilization supports creative and situation-aware design and diminishes the gap between theory and practice.
Generally, physics teachers produce their own supplementary instructional material. Even though customization thinking is not unknown, the threshold for producing an entire textbook might be high. Even though the observations here are from the general physics course at the NDU, the research also gives tools for development in other discipline-related educational contexts. This research is an example of an instructional material development effort, together with the questions it uncovers, and presents thoughts on when textbook customization is rewarding. At the same time, the research aims to further creative customization thinking in instruction and development. Keywords: Physics textbook, PER (Physics Education Research), Instructional quality, Customization, Creativity
Resumo:
Modern smartphones often come with a significant amount of computational power and an integrated digital camera, making them an ideal platform for intelligent assistants. This work is restricted to retail environments, where users could be provided with, for example, navigational instructions to desired products or information about special offers within their close proximity. These kinds of applications usually require information about the user's current location in the domain environment, which in our case corresponds to a retail store. We propose a vision-based positioning approach that recognizes the products the user's mobile phone camera is currently pointing at. The products are related to locations within the store, which enables us to locate the user by pointing the mobile phone's camera at a group of products. The first step of our method is to extract meaningful features from digital images. We use the Scale-Invariant Feature Transform (SIFT) algorithm, which extracts features that are highly distinctive in the sense that they can be correctly matched against a large database of features from many images. We collect a comprehensive set of images from all meaningful locations within our domain and extract the SIFT features from each of these images. As the SIFT features are of high dimensionality, and thus comparing individual features is infeasible, we apply the Bags of Keypoints method, which creates a generic representation, a visual category, from all features extracted from images taken at a specific location. The category of an unseen image can be deduced by extracting the corresponding SIFT features and choosing the category that best fits the extracted features. We have applied the proposed method within a Finnish supermarket. We consider grocery shelves as categories, which is a sufficient level of accuracy to help users navigate or to provide useful information about nearby products.
We achieve 40% accuracy, which, while significantly outperforming the random-guess baseline, is still quite low for commercial applications. Our results suggest that the classification accuracy could be increased with a deeper analysis of the domain and by combining existing positioning methods with ours.
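The Bags of Keypoints pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: random toy descriptors stand in for real 128-D SIFT features, the cluster count and helper names are assumptions, and a plain k-means builds the visual vocabulary.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    # build the visual vocabulary: k centroids over all descriptors
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

def bag_of_keypoints(descriptors, centroids):
    # assign each descriptor to its nearest visual word, then histogram
    words = np.argmin(((descriptors[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(words, minlength=len(centroids)).astype(float)
    return hist / hist.sum()

# toy stand-ins for SIFT descriptors from two shelf locations
shelf_a = rng.normal(0.0, 1.0, (200, 8))
shelf_b = rng.normal(5.0, 1.0, (200, 8))

vocab = kmeans(np.vstack([shelf_a, shelf_b]), k=10)
cat_a = bag_of_keypoints(shelf_a, vocab)   # visual category for shelf A
cat_b = bag_of_keypoints(shelf_b, vocab)   # visual category for shelf B

# classify an unseen image by comparing its histogram to each category
query = rng.normal(5.0, 1.0, (50, 8))      # descriptors from a new photo of shelf B
q_hist = bag_of_keypoints(query, vocab)
dists = [np.abs(q_hist - c).sum() for c in (cat_a, cat_b)]
print("predicted shelf:", "AB"[int(np.argmin(dists))])
```

In a real system the toy descriptors would be replaced by SIFT output and each histogram would correspond to one shelf location in the store.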
Resumo:
The management and coordination of business-process collaboration is changing because of globalization, specialization, and innovation. Service-oriented computing (SOC) is a means towards business-process automation, and recently many industry standards have emerged to become part of the service-oriented architecture (SOA) stack. In a globalized world, organizations face new challenges in setting up and carrying out collaborations in semi-automated ecosystems for business services. To be efficient and effective, many companies express their services electronically in what we term business-process as a service (BPaaS). Companies then source BPaaS on the fly from third parties if they are not able to create all service value in-house, for reasons such as lack of resources, lack of know-how, or cost- and time-reduction needs. Thus, a need emerges for BPaaS-HUBs that not only store service offers and requests together with information about their issuing organizations and assigned owners, but that also allow an evaluation of trust and reputation in an anonymized electronic service marketplace. In this paper, we analyze the requirements, design architecture, and system behavior of such a BPaaS-HUB to enable a fast setup and enactment of business-process collaboration. Moving into a cloud-computing setting, the results of this paper allow system designers to quickly evaluate which services they need for instantiating the BPaaS-HUB architecture. Furthermore, the results also show the protocol of a backbone service bus that allows communication between the services implementing the BPaaS-HUB. Finally, the paper analyzes where an instantiation must assign additional computing resources to avoid performance bottlenecks.
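A minimal sketch of the core behavior such a BPaaS-HUB would provide: storing service offers and ranking matches by provider reputation. All class, method, and provider names here are hypothetical illustrations, not the paper's actual design or protocol.

```python
from dataclasses import dataclass, field

@dataclass
class ServiceOffer:
    provider: str   # issuing organization (anonymized in a real marketplace)
    process: str    # the business process offered as a service

@dataclass
class BPaaSHub:
    """Hypothetical hub: stores offers and tracks provider reputation."""
    offers: list = field(default_factory=list)
    reputation: dict = field(default_factory=dict)

    def publish(self, offer):
        # register an offer and initialize the provider's rating history
        self.offers.append(offer)
        self.reputation.setdefault(offer.provider, [])

    def rate(self, provider, rating):
        # record a rating (e.g. 0.0-1.0) after a completed collaboration
        self.reputation[provider].append(rating)

    def match(self, process):
        # return matching offers, best mean reputation first
        hits = [o for o in self.offers if o.process == process]
        def score(o):
            r = self.reputation[o.provider]
            return sum(r) / len(r) if r else 0.0
        return sorted(hits, key=score, reverse=True)

hub = BPaaSHub()
hub.publish(ServiceOffer("org-a", "invoice-processing"))
hub.publish(ServiceOffer("org-b", "invoice-processing"))
hub.rate("org-b", 0.9)
best = hub.match("invoice-processing")[0]
print(best.provider)
```

A requesting company sourcing a process on the fly would call `match` and pick the highest-reputation provider; the paper's actual hub adds the service-bus protocol and anonymization on top of this kind of store.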