{% extends "guide.html" %} {% block guide %}
CEP (Core Experience Portal) is the core engine for all TACC portal projects. The primary goal of the Core Portal project is to establish a codebase that can serve as a springboard for all future portal projects undertaken by TACC. By establishing a common codebase for all portal efforts, we can better maintain alignment between the core capabilities and technologies supported across all TACC portals. Some portal projects will have unique requirements, but CEP should provide an "out-of-the-box" framework for rapidly deploying a new portal project with all the common capabilities already in place and compliant with current best practices and conventions at TACC.
Note: Any additional portal capabilities required by a project need to be identified and planned for independently.
The portal architecture is organized in tiers. Listed below in order from the outermost tier inward, they are:
Portals use Agave to manage apps, data storage, and reconfigurable workflows, and to interact with HPC resources. The web portal architecture provides data management, analysis, and simulation capabilities to users through a web interface. The dashboard provides an overview of job status, data storage, allocations, and system status. Users will primarily interact with the portal through the Workbench, which includes: Manage Account, Data Files, Application Workspace, Search, and Help.
The dashboard displays the availability of TACC resources and the user's allocation usage. The CEP infrastructure runs on TACC-hosted virtual machines as a Django/Angular web portal with a responsive layout. Every portal project will have dedicated VM resources, and system status instrumentation will be provided via APIs. The AngularJS framework works by first reading the HTML page, which has additional custom tag attributes embedded in it. Angular interprets those attributes as directives to bind input or output parts of the page to a model that is represented by standard JavaScript variables. The values of those JavaScript variables can be set manually within the code, or retrieved from static or dynamic JSON resources.
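The directive-based binding described above can be sketched in a minimal AngularJS fragment (illustrative only, not taken from the CEP codebase; the module and model names are assumptions):

```html
<!-- ng-model binds the input field to a JavaScript model variable (jobName);
     the double-curly expression re-renders whenever that model changes. -->
<div ng-app="demoApp" ng-controller="DemoCtrl">
  <input type="text" ng-model="jobName">
  <p>Submitting job: {{ jobName }}</p>
</div>
```

Here `ng-app`, `ng-controller`, and `ng-model` are the custom tag attributes Angular reads and interprets as directives when it processes the page.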
Data Files is a collection of storage spaces where user and project data are located, stored, and ultimately organized by users to curate publications and share information. The Data Repository is where experimental and simulation results are stored long term. Working storage is the area for sharing and collaborating on data that is not yet published, while the Workspace allows for the analysis of data and serves as a gateway to large-scale HPC resources and simulation tools. The data is organized in three categories:
APPS are executable codes available for invocation through Agave's Jobs service on a specific execution system. If a single code needs to run on multiple systems, each combination of code and system must be defined as a separate app. Any code that can be forked at the command line or submitted to a batch scheduler can be registered as an Agave app and run through the Jobs service. The user submits a request by filling in the application inputs and job details on the CEP portal. The app is packaged with 3 files:
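As a rough sketch of the request described above, the portal might assemble a job submission body from the user's form values before POSTing it to the Jobs service. The helper function and the example app ID below are illustrative assumptions; the field names (`appId`, `archive`, `inputs`, `parameters`) follow Agave's job request schema:

```javascript
// Hypothetical helper: build an Agave job request from portal form values.
function buildJobRequest(appId, name, inputs, parameters) {
  return {
    name: name,            // user-chosen job name shown in job listings
    appId: appId,          // app registered on a specific execution system
    archive: true,         // copy outputs back to the user's storage system
    inputs: inputs,        // file inputs Agave stages in before execution
    parameters: parameters // scalar values templated into the wrapper script
  };
}

// Example (hypothetical app ID and input URI):
const jobRequest = buildJobRequest(
  "compress-0.1u1",
  "demo-compress-job",
  { targetFile: "agave://data.storage.system/user/file.txt" },
  { compressionType: "gzip" }
);
console.log(jobRequest.appId);
```

The resulting object would be sent as JSON in the body of a POST to the Jobs service endpoint.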
The Notification Bell enables the user to view information and the status of submitted jobs.
CEP provides a multi-tenant-capable full-text search engine, based on Elasticsearch, with an HTTP web interface and schema-free JSON documents. Users can search for data files or text.
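A full-text search against such an engine is expressed as a JSON query document sent over HTTP. The sketch below builds a simple Elasticsearch `query_string` query body; the field names (`name`, `body`) are illustrative assumptions, not the CEP index schema:

```javascript
// Hypothetical helper: build an Elasticsearch full-text query body
// for a user's search term, limited to the first page of results.
function buildSearchQuery(term) {
  return {
    query: {
      query_string: {
        query: term,             // the user's free-text search input
        fields: ["name", "body"] // assumed document fields to search
      }
    },
    size: 10 // number of hits to return
  };
}

console.log(JSON.stringify(buildSearchQuery("flood simulation")));
```

The portal would POST this body to the index's `_search` endpoint and render the returned hits.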
The Core Portal utilizes a wide variety of technologies developed by multiple technology vendors. Below is a list of the primary libraries, frameworks, and APIs leveraged in the Core Portal tech stack.