#the ui representation is so close to the logic data structure
s-lycopersicum · 1 month ago
Separation of Concerns is good in theory, but have you considered that, in practice, I don't wanna do it
qwertsypage · 4 years ago
Building a Real-Time Webapp with Node.js and Socket.io
In this blogpost we showcase a project we recently finished for the National Democratic Institute, an NGO that supports democratic institutions and practices worldwide. NDI's mission is to strengthen political and civic organizations, safeguard elections and promote citizen participation, openness and accountability in government.
Our assignment was to build an MVP of an application that supports the facilitators of a cybersecurity-themed interactive simulation game. Because the webapp needs to be used by several people on different machines at the same time, it required real-time synchronization, which we implemented using Socket.io.
In the following article you can learn how we approached the project, how we structured the data access layer and how we solved challenges around creating our websocket server, just to mention a few topics. The final code of the project is open source, and you're free to check it out on GitHub.
A Brief Overview of the CyberSim Project
Political parties are at extreme risk from hackers and other adversaries, yet they rarely understand the range of threats they face. When they do get cybersecurity training, it's often in the form of dull, technically complicated lectures. To help parties and campaigns better understand the challenges they face, NDI developed a cybersecurity simulation (CyberSim) about a political campaign rocked by a range of security incidents. The goal of the CyberSim is to facilitate buy-in for and implementation of better security practices by helping political campaigns assess their own readiness and experience the potential consequences of unmitigated risks.
The CyberSim is broken down into three core segments: preparation, simulation, and an after action review. During the preparation phase, participants are introduced to a fictional (but realistic) game-play environment, their roles, and the rules of the game. They are also given an opportunity to select security-related mitigations from a limited budget, providing an opportunity to "secure their systems" to the best of their knowledge and ability before the simulation begins.
The simulation itself runs for 75 minutes, during which time the participants can take actions to raise funds, boost support for their candidate and, most importantly, respond to events that may negatively impact their campaign's success. These events are meant to test the readiness, awareness and skills of the participants related to information security best practices. The simulation is designed to mirror the busyness and intensity of a typical campaign environment.
The after action review is in many ways the most critical element of the CyberSim exercise. During this segment, CyberSim facilitators and participants review what happened during the simulation, which events led to which problems, and what actions the participants took (or should have taken) to prevent security incidents from occurring. These lessons are closely aligned with the best practices presented in the Cybersecurity Campaigns Playbook, making the CyberSim an ideal opportunity to reinforce existing knowledge or introduce new best practices presented there.
Since data representation serves as the skeleton of each application, Norbert, who built part of the app, will first walk you through the data layer created using knex and Node.js. Then he will move on to the program's heart, the socket server that manages real-time communication.
This is going to be a series of articles, so in the next part, we will look at the frontend, which is built with React. Finally, in the third post, Norbert will present the muscle that is the project's infrastructure. We used Amazon's tools to create the CI/CD pipeline, host the webserver, the static frontend app, and the database.
Now that we're through with the intro, you can enjoy reading this Socket.io tutorial / Case Study from Norbert:
The Project's Structure
Before diving deep into the data access layer, let's take a look at the project's structure:
.
├── migrations
│   └── ...
├── seeds
│   └── ...
├── src
│   ├── config.js
│   ├── logger.js
│   ├── constants
│   │   └── ...
│   ├── models
│   │   └── ...
│   ├── util
│   │   └── ...
│   ├── app.js
│   └── socketio.js
└── index.js
As you can see, the structure is relatively straightforward, as we’re not really deviating from a standard Node.js project structure. To better understand the application, let’s start with the data model.
The Data Access Layer
Each game starts with a preprogrammed poll percentage and an available budget. Throughout the game, threats (called injections) occur at predefined times (e.g., in the second minute), to which players have to respond. To spice things up, responses and actions depend on several systems the campaign staff relies on, and these systems often go down as a result of injections. The game's final goal is simple: the players have to maximize their party's poll percentage by answering each threat.
We used a PostgreSQL database to store the state of each game. Tables that make up the data model can be classified into two different groups: setup and state tables. Setup tables store data that are identical and constant for each game, such as:
injections - contains each threat players face during the game, e.g., Data breach
injection responses - a one-to-many table that shows the possible reactions for each injection
actions - operations that have an immediate, one-time effect, e.g., Campaign advertisement
systems - tangible and intangible IT assets, which are prerequisites of specific responses and actions, e.g., HQ Computers
mitigations - tangible and intangible assets that mitigate upcoming injections, e.g., Create a secure backup for the online party voter database
roles - different divisions of a campaign party, e.g., HQ IT Team
curveball events - one-time events controlled by the facilitators, e.g., Banking system crash
On the other hand, state tables define the state of a game and change during the simulation. These tables are the following:
game - properties of a game like budget, poll, etc.
game systems - stores the condition of each system (is it online or offline) throughout the game
game mitigations - shows if players have bought each mitigation
game injection - stores information about injections that have happened, e.g., was it prevented, responses made to it
game log - a running record of notable events, e.g., a triggered curveball, together with the game timer at which they happened. (A hedged sketch of how one of these state tables might be declared follows this list.)
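To make the shape of a state table concrete, here is a minimal sketch of a knex migration for the game table. The column names come from the getGame query shown later in this post; the types, defaults and file name are assumptions, not the repo's actual migration:

// migrations/create_game.js - illustrative sketch only
exports.up = (knex) =>
  knex.schema.createTable('game', (table) => {
    table.string('id').primary(); // facilitators choose the id when creating a game
    table.string('state');
    table.integer('poll'); // current poll percentage, kept between 0 and 100
    table.integer('budget');
    table.timestamp('started_at');
    table.boolean('paused').defaultTo(true);
    table.integer('millis_taken_before_started').defaultTo(0);
  });

exports.down = (knex) => knex.schema.dropTable('game');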
To help you visualize the database schema, have a look at the following diagram. Please note that the game_log table was intentionally left out of the image since it adds unnecessary complexity to the picture and doesn't really help in understanding the core functionality of the game:
[Diagram: database schema of the setup and state tables]
To sum up, the state tables always store the current state of any ongoing game. Each modification done by a facilitator must be saved and then transported back to every coordinator. To do so, we defined a method in the data access layer that returns the current state of the game; it is called after each state update:
// ./src/game.js
const db = require('./db');

const getGame = (id) =>
  db('game')
    .select(
      'game.id',
      'game.state',
      'game.poll',
      'game.budget',
      'game.started_at',
      'game.paused',
      'game.millis_taken_before_started',
      'i.injections',
      'm.mitigations',
      's.systems',
      'l.logs',
    )
    .where({ 'game.id': id })
    .joinRaw(
      `LEFT JOIN (SELECT gm.game_id, array_agg(to_json(gm)) AS mitigations
        FROM game_mitigation gm GROUP BY gm.game_id) m ON m.game_id = game.id`,
    )
    .joinRaw(
      `LEFT JOIN (SELECT gs.game_id, array_agg(to_json(gs)) AS systems
        FROM game_system gs GROUP BY gs.game_id) s ON s.game_id = game.id`,
    )
    .joinRaw(
      `LEFT JOIN (SELECT gi.game_id, array_agg(to_json(gi)) AS injections
        FROM game_injection gi GROUP BY gi.game_id) i ON i.game_id = game.id`,
    )
    .joinRaw(
      `LEFT JOIN (SELECT gl.game_id, array_agg(to_json(gl)) AS logs
        FROM game_log gl GROUP BY gl.game_id) l ON l.game_id = game.id`,
    )
    .first();
The const db = require('./db'); line returns a database connection established via knex, used for querying and updating the database. By calling the function above, the current state of a game can be retrieved, including each mitigation already purchased and still available for sale, online and offline systems, injections that have happened, and the game's log. Here is an example of how this logic is applied after a facilitator triggers a curveball event:
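The db module itself isn't shown in the article. A minimal sketch of what it could look like, assuming the config module seen in the project structure exposes a connection string (the databaseUrl key is made up):

// ./src/db.js - hedged sketch, initializing knex against PostgreSQL
const knex = require('knex');
const config = require('./config');

module.exports = knex({
  client: 'pg',
  connection: config.databaseUrl, // hypothetical key, e.g. postgres://localhost/cybersim
});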
// ./src/game.js
const performCurveball = async ({ gameId, curveballId }) => {
  try {
    const game = await db('game')
      .select(
        'budget',
        'poll',
        'started_at as startedAt',
        'paused',
        'millis_taken_before_started as millisTakenBeforeStarted',
      )
      .where({ id: gameId })
      .first();

    const { budgetChange, pollChange, loseAllBudget } = await db('curveball')
      .select(
        'lose_all_budget as loseAllBudget',
        'budget_change as budgetChange',
        'poll_change as pollChange',
      )
      .where({ id: curveballId })
      .first();

    await db('game')
      .where({ id: gameId })
      .update({
        budget: loseAllBudget ? 0 : Math.max(0, game.budget + budgetChange),
        poll: Math.min(Math.max(game.poll + pollChange, 0), 100),
      });

    await db('game_log').insert({
      game_id: gameId,
      game_timer: getTimeTaken(game),
      type: 'Curveball Event',
      curveball_id: curveballId,
    });
  } catch (error) {
    logger.error('performCurveball ERROR: %s', error);
    throw new Error('Server error on performing action');
  }
  return getGame(gameId);
};
As you can see, after the update to the game's state happens, which this time is a change in budget and poll, the program calls the getGame function and returns its result. By applying this logic, we can manage the state easily. We have to arrange the coordinators of the same game into groups, somehow map each possible event to a corresponding function in the models folder, and broadcast the game to everyone after someone makes a change. Let's see how we achieved it by leveraging WebSockets.
Creating Our Real-Time Socket.io Server with Node.js
As the software we've created is a companion app to an actual tabletop game played at different locations, it is as real time as it gets. To handle such use cases, where the state of the UIs needs to be synchronized across multiple clients, WebSockets are the go-to solution. To implement the WebSocket server and client, we chose to use Socket.io. While Socket.io clearly comes with a performance overhead, it freed us from a lot of the hassle that arises from the stateful nature of WebSocket connections. As the expected load was minuscule, the overhead Socket.io introduced was far outweighed by the savings in development time it provided. One of the killer features of Socket.io that fit our use case very well is that operators who join the same game can be separated easily using Socket.io rooms. This way, after a participant updates the game, we can broadcast the new state to the entire room (everyone who has currently joined a particular game).
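To illustrate the room mechanics in isolation, here is a toy sketch (not project code; the event names are made up):

// Toy sketch of Socket.io rooms: sockets that join the same room can be
// addressed as one group, which is exactly what a game's facilitators need.
const http = require('http').createServer();
const io = require('socket.io')(http);

io.on('connection', (socket) => {
  socket.on('joinGame', (gameId) => {
    socket.join(gameId); // this client now belongs to the game's room
    io.in(gameId).emit('gameUpdated', { joined: socket.id }); // everyone in the room gets this
  });
});

http.listen(3000);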
To create a socket server, all we need is a Server instance created by the createServer method of the default Node.js http module. For maintainability, we organized the Socket.io logic into its own module (see ./src/socketio.js). This module exports a factory function with one argument: an http Server object. Let's have a look at it:
// ./src/socketio.js
const socketio = require('socket.io');
const SocketEvents = require('./constants/SocketEvents');

module.exports = (http) => {
  const io = socketio(http);
  io.on(SocketEvents.CONNECT, (socket) => {
    socket.on('EVENT', (input) => {
      // Do something with the given input
    });
  });
};
// index.js
const { createServer } = require('http');
const app = require('./src/app'); // Express app
const createSocket = require('./src/socketio');
const logger = require('./src/logger'); // pino logger used in the listen callback

const port = process.env.PORT || 3001;

const http = createServer(app);
createSocket(http);

const server = http.listen(port, () => {
  logger.info(`Server is running at port: ${port}`);
});
As you can see, the socket server logic is implemented inside the factory function, which is then called in index.js with the http Server instance. We didn't have to implement authorization during this project, so there isn't any Socket.io middleware that authenticates each client before establishing the connection. Inside the socketio module, we created an event handler for each possible action a facilitator can perform, including documenting responses made to injections, buying mitigations, restoring systems, etc. Then we mapped the methods defined in the data access layer to these handlers.
Bringing Together Facilitators
I previously mentioned that rooms make it easy to group facilitators by the game they have currently joined. A facilitator can enter a room by either creating a fresh new game or joining an existing one. Translating this to "WebSocket language", a client emits a createGame or joinGame event. Let's have a look at the corresponding implementation:
// ./src/socketio.js
const socketio = require('socket.io');
const SocketEvents = require('./constants/SocketEvents');
const logger = require('./logger');
const { createGame, getGame } = require('./models/game');

module.exports = (http) => {
  const io = socketio(http);
  io.on(SocketEvents.CONNECT, (socket) => {
    logger.info('Facilitator CONNECT');
    let gameId = null;

    socket.on(SocketEvents.DISCONNECT, () => {
      logger.info('Facilitator DISCONNECT');
    });

    socket.on(SocketEvents.CREATEGAME, async (id, callback) => {
      logger.info('CREATEGAME: %s', id);
      try {
        const game = await createGame(id);
        if (gameId) {
          await socket.leave(gameId); // leave the previous game's room first
        }
        await socket.join(id);
        gameId = id;
        callback({ game });
      } catch (_) {
        callback({ error: 'Game id already exists!' });
      }
    });

    socket.on(SocketEvents.JOINGAME, async (id, callback) => {
      logger.info('JOINGAME: %s', id);
      try {
        const game = await getGame(id);
        if (!game) {
          callback({ error: 'Game not found!' });
          return; // bail out so we don't join a nonexistent game
        }
        if (gameId) {
          await socket.leave(gameId);
        }
        await socket.join(id);
        gameId = id;
        callback({ game });
      } catch (error) {
        logger.error('JOINGAME ERROR: %s', error);
        callback({ error: 'Server error on join game!' });
      }
    });
  });
};
If you examine the code snippet above, the gameId variable contains the id of the game the facilitator has currently joined. Thanks to JavaScript closures, we declared this variable inside the connect callback function, so gameId is in the scope of all of the following handlers. If an organizer tries to create a game while already playing one (which means that gameId is not null), the socket server first kicks the facilitator out of the previous game's room, then joins the facilitator to the new game room. This is managed by the leave and join methods. The process flow of the joinGame handler is almost identical. The only key difference is that this time the server doesn't create a new game. Instead, it queries the already existing one using the getGame method of the data access layer shown earlier.
What Makes Up Our Event Handlers?
After we successfully brought together our facilitators, we had to create a different handler for each possible event. For the sake of completeness, let's look at all the events that occur during a game (a sketch of the constants module that names these events follows the list):
createGame, joinGame: these events' single purpose is to join the organizer to the correct game room.
startSimulation, pauseSimulation, finishSimulation: these events are used to start the game's timer, pause the timer, and stop the game entirely. Once someone emits a finishSimulation event, the game can't be restarted.
deliverInjection: using this event, facilitators trigger security threats that are scheduled to occur at a given time in the game.
respondToInjection, nonCorrectRespondToInjection: these events record the responses made to injections.
restoreSystem: this event restores any system that is offline due to an injection.
changeMitigation: this event is triggered when players buy mitigations to prevent injections.
performAction: when the playing staff performs an action, the client emits this event to the server.
performCurveball: this event occurs when a facilitator triggers unique injections.
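Each of these names is referenced through the ./src/constants/SocketEvents module required in the snippets above. The exact file lives in the repo; a plausible sketch, with the string values inferred from the event names (so treat them as assumptions), looks like this:

// ./src/constants/SocketEvents.js - plausible sketch; check the repo for the exact file
module.exports = {
  CONNECT: 'connection', // Socket.io's built-in connection event
  DISCONNECT: 'disconnect',
  CREATEGAME: 'createGame',
  JOINGAME: 'joinGame',
  STARTSIMULATION: 'startSimulation',
  PAUSESIMULATION: 'pauseSimulation',
  FINISHSIMULATION: 'finishSimulation',
  DELIVERINJECTION: 'deliverInjection',
  RESPONDTOINJECTION: 'respondToInjection',
  NONCORRECTRESPONDTOINJECTION: 'nonCorrectRespondToInjection',
  RESTORESYSTEM: 'restoreSystem',
  CHANGEMITIGATION: 'changeMitigation',
  PERFORMACTION: 'performAction',
  PERFORMCURVEBALL: 'performCurveball',
  GAMEUPDATED: 'gameUpdated', // server-to-client broadcast of the new state
};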
These event handlers implement the following rules:
They take up to two arguments: an optional input, which is different for each event, and a predefined callback. The callback is an exciting feature of Socket.io called acknowledgment. It lets us define a callback function on the client side, which the server can invoke with either an error or a game object. Without diving deep into how the frontend works (since this is a topic for another day), this function pops up an alert with either an error or a success message. This message will only appear for the facilitator who initiated the event.
They update the state of the game by the given inputs according to the event's nature.
They broadcast the new state of the game to the entire room. Hence we can update the view of all organizers accordingly.
First, let's build on our previous example and see how the handler for curveball events is implemented.
// ./src/socketio.js
const socketio = require('socket.io');
const SocketEvents = require('./constants/SocketEvents');
const logger = require('./logger');
const { performCurveball } = require('./models/game');

module.exports = (http) => {
  const io = socketio(http);
  io.on(SocketEvents.CONNECT, (socket) => {
    logger.info('Facilitator CONNECT');
    let gameId = null;

    socket.on(
      SocketEvents.PERFORMCURVEBALL,
      async ({ curveballId }, callback) => {
        logger.info(
          'PERFORMCURVEBALL: %s',
          JSON.stringify({ gameId, curveballId }),
        );
        try {
          const game = await performCurveball({ gameId, curveballId });
          io.in(gameId).emit(SocketEvents.GAMEUPDATED, game);
          callback({ game });
        } catch (error) {
          callback({ error: error.message });
        }
      },
    );
  });
};
The curveball event handler takes one input, a curveballId, plus the callback mentioned earlier. The performCurveball method then updates the game's poll and budget and returns the new game object. If the update is successful, the socket server emits a gameUpdated event to the game room with the latest state, then calls the callback function with the game object. If any error occurs, the callback is called with an error object instead.
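To make the acknowledgment flow tangible, here is a hedged sketch of what the client side could look like (the actual React client is the subject of the next post; socket.io-client is the standard client library, and the curveballId value is made up):

// Hypothetical client-side sketch using socket.io-client
const io = require('socket.io-client');
const socket = io('http://localhost:3001');

// The last argument is the acknowledgment callback the server invokes with
// either { game } or { error }; only this client receives it.
socket.emit('performCurveball', { curveballId: 1 }, ({ game, error }) => {
  if (error) {
    console.error(error); // the real UI pops up an alert here
  } else {
    console.log('budget is now', game.budget);
  }
});

// Every facilitator in the room receives the broadcasted new state.
socket.on('gameUpdated', (game) => {
  // re-render the view with the fresh game object
});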
After a facilitator creates a game, first, a preparation view is loaded for the players. In this stage, staff members can spend a portion of their budget to buy mitigations before the game starts. Once the game begins, it can be paused, restarted, or even stopped permanently. Let's have a look at the corresponding implementation:
// ./src/socketio.js
const socketio = require('socket.io');
const SocketEvents = require('./constants/SocketEvents');
const logger = require('./logger');
const { startSimulation, pauseSimulation } = require('./models/game');

module.exports = (http) => {
  const io = socketio(http);
  io.on(SocketEvents.CONNECT, (socket) => {
    logger.info('Facilitator CONNECT');
    let gameId = null;

    socket.on(SocketEvents.STARTSIMULATION, async (callback) => {
      logger.info('STARTSIMULATION: %s', gameId);
      try {
        const game = await startSimulation(gameId);
        io.in(gameId).emit(SocketEvents.GAMEUPDATED, game);
        callback({ game });
      } catch (error) {
        callback({ error: error.message });
      }
    });

    socket.on(SocketEvents.PAUSESIMULATION, async (callback) => {
      logger.info('PAUSESIMULATION: %s', gameId);
      try {
        const game = await pauseSimulation({ gameId });
        io.in(gameId).emit(SocketEvents.GAMEUPDATED, game);
        callback({ game });
      } catch (error) {
        callback({ error: error.message });
      }
    });

    socket.on(SocketEvents.FINISHSIMULATION, async (callback) => {
      logger.info('FINISHSIMULATION: %s', gameId);
      try {
        const game = await pauseSimulation({ gameId, finishSimulation: true });
        io.in(gameId).emit(SocketEvents.GAMEUPDATED, game);
        callback({ game });
      } catch (error) {
        callback({ error: error.message });
      }
    });
  });
};
The startSimulation method kicks off the game's timer, and the pauseSimulation method pauses or permanently finishes the game. Trigger time is essential in determining which injections facilitators can invoke. After organizers trigger a threat, they hand over all necessary assets to the players. Staff members can then choose how they respond to the injection by providing a custom response or choosing from the predefined options. Besides facing threats, staff members perform actions, restore systems, and buy mitigations. The corresponding events to these activities can be triggered anytime during the game. These event handlers follow the same pattern and implement the three fundamental rules above. Please check the public GitHub repo if you would like to examine these callbacks.
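The performCurveball snippet earlier leans on a getTimeTaken helper from the util folder to record when an event happened on the game clock. The article doesn't show it, but given the columns involved it plausibly works like this (a sketch under that assumption, not the repo's exact code):

// ./src/util - hedged sketch of getTimeTaken: the elapsed game time is whatever
// accumulated before the last pause plus, while the game is running, the time
// since it was (re)started.
const getTimeTaken = ({ paused, startedAt, millisTakenBeforeStarted }) =>
  paused || !startedAt
    ? millisTakenBeforeStarted
    : millisTakenBeforeStarted + (Date.now() - new Date(startedAt).getTime());

module.exports = { getTimeTaken };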
Serving The Setup Data
In the section explaining the data access layer, I classified tables into two different groups: setup and state tables. State tables contain the condition of ongoing games. This data is served and updated via the event-based socket server. On the other hand, setup data consists of the available systems, mitigations, actions, curveball events, the injections that occur during the game, and each possible response to them. This data is exposed via a simple HTTP server. After a facilitator joins a game, the React client requests this data, then caches and uses it throughout the game. The HTTP server is implemented using the express library. Let's have a look at our app.js:
// ./src/app.js
const helmet = require('helmet');
const express = require('express');
const cors = require('cors');
const expressPino = require('express-pino-logger');
const logger = require('./logger');
const db = require('./db'); // used by the raw table lookups below
const { getResponses } = require('./models/response');
const { getInjections } = require('./models/injection');
const { getActions } = require('./models/action');

const app = express();

app.use(helmet());
app.use(cors());
app.use(expressPino({ logger }));

// STATIC DB data is exposed via a REST api
app.get('/mitigations', async (req, res) => {
  const records = await db('mitigation');
  res.json(records);
});

app.get('/systems', async (req, res) => {
  const records = await db('system');
  res.json(records);
});

app.get('/injections', async (req, res) => {
  const records = await getInjections();
  res.json(records);
});

app.get('/responses', async (req, res) => {
  const records = await getResponses();
  res.json(records);
});

app.get('/actions', async (req, res) => {
  const records = await getActions();
  res.json(records);
});

app.get('/curveballs', async (req, res) => {
  const records = await db('curveball');
  res.json(records);
});

module.exports = app;
As you can see, everything is pretty standard here. We didn't need to implement any method other than GET since this data is inserted and changed using seeds.
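As an illustration, a knex seed file for the curveball table could look roughly like this. The column names are taken from the performCurveball query above; the description column and all values are invented:

// ./seeds/curveball.js - illustrative sketch of a knex seed, not the repo's data
exports.seed = async (knex) => {
  await knex('curveball').del(); // reset the table before re-seeding
  await knex('curveball').insert([
    {
      description: 'Banking system crash', // example event from the article
      lose_all_budget: true,
      budget_change: 0,
      poll_change: -10,
    },
  ]);
};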
Final Thoughts On Our Socket.io Game
Now we can put together how the backend works. State tables store the games' state, and the data access layer returns the new game state after each update. The socket server organizes the facilitators into rooms, so each time someone changes something, the new game is broadcast to the entire room. Hence we can make sure that everyone has an up-to-date view of the game. In addition to the dynamic game data, the static tables are accessible via the HTTP server.
Next time, we will look at how the React client manages all this, and after that I'll present the infrastructure behind the project. You can check out the code of this app in the public GitHub repo!
In case you're looking for experienced full-stack developers, feel free to reach out to us via [email protected] or using the form below this article.
You can also check out our Node.js Development & Consulting service page for more info on our capabilities.
rajon007 · 8 years ago
All you need is data
Interview Jan Kratochvil / Simon Denny for Rajon 5
I read in Hans Ulrich Obrist's interview with Julian Assange about his concept of three types of history. First, knowledge - like how to refine oil or how to make a plastic bottle - which is maintained and sustained by the production and economy around it. Second, historical records, telling us various stories from prehistory till today, always present, slowly disintegrating or being reinterpreted, thus manipulated, but without an existing intention to get rid of them. The last type is something people put lots of energy and economic power into willingly destroying or keeping secret. That last type is obviously something Assange is interested in. In your case it's more complicated, I would say: even though you were working with the Snowden files, that leaked information is not the single core of your interest. You compile past and present, even if the past means yesterday and the present could possibly be perceived as tomorrow. So what highlights topics of interest for your work?

First of all, thanks for the careful questions - they are complex, so my answers might also be. I hope you'll forgive that! Secondly, I love that interview; it's my second-favorite Assange interview and my favorite Obrist interview. I also read in the "When Google Met WikiLeaks" book that it was an interview Assange also felt very happy with. I am very interested in the way organizations, particularly tech-related organizations, present themselves to each other and to the world. I'm interested in the language they use and the information they prioritize, as well as the way they use images, objects and systems in service of these priorities. This usually involves telling stories of some kind. If we want to relate it to your interpretation of Assange's three histories, maybe this activity relates more to the second and third kind of history you identify here. We could also say most tech companies implicitly or explicitly lay claim to the first kind too. I guess if I had to choose, I am most interested in your/his second kind of history. I feel like the kind of history-storytelling my exhibitions hover around and frame is about imagining the way we might see recent history through characters from the present. So in more concrete terms, my exhibitions often involve picking an organization, a practice or an individual, and reiterating or recontextualizing an existing story through them. With the Snowden-related material, I chose a Creative Director to recast as an artistic master in a longer lineage of state-commissioned images, using themes and aesthetic memes to unpack the value systems that might be found in the intelligence community's visual choices. In my Serpentine exhibition, Products for Organising, I played many voices against each other, trying to visualize the relationship between a marketing-oriented view on the history of hacking and how that might be used to service commercial and governmental organizational innovation. In both cases, histories of a present were told from a biased position. As you say, I compile recent history and kind of posit a view on the present and past that demonstrates its interests.
Tell me more about this always-present durational aspect of your work. The passage of time around us is super fast nowadays, and iOS 6 (was it 6?) with its skeuomorphic design already looks like an antiquity when you're using iOS 9. More subtle changes in UI, like Apple's switch of the system typeface from Helvetica to San Francisco, are less visible to most customers, yet you reflect on them. What is this timeline you're creating actually saying, besides the obvious?

I believe in design as a time-stamp. I think objects, graphics, fonts and GUIs capture a moment in a very rich way. Popular interfaces to communication carry something of a worldview and a representation of what's possible and what's important at a certain moment. In this way, Tim Cook's decision to have a custom typeface rather than a modernist classic as the universal system font of one of the world's most dominant platforms says something about the world in 2015. Maybe this could represent a look inwards for the powerful tech giant? The fact that iOS was skeuomorphic also says something about 2007-2012. Maybe we were learning to use and carry touch-screen portals, or learning to want them.

The environments you're forming hold the essence of some utopian repositories of knowledge - very specifically selected knowledge. Do you relate to some ancient utopian urban plans and structures? I mean, besides the Tower of Babel - thinking about New Atlantis, De Civitate Dei, More's Utopia, Civitas Solis, Civitas Veri and so on. In other words, is there a long-term political ambition behind organising all that data into exhibition set-ups? (The funny thing is that you mention in the YouTube guided tour through "Babel" that the idea of a tower came from the curator, so I'm just not sure if it's something you would yourself find interesting or if it came out of the process of preparing the show with another person.)

Yeah, in this case the Babel commission really came out of a conversation with Daniel Birnbaum and Hans Ulrich Obrist, along with Luma, where Alessandro Bava and I worked on researching and reinterpreting not only the Tower of Babel but also a history of radical exhibition making and design at the Moderna Museet. So that was really a very group-authored thing, which also involved performances and poetry created by Simon Castets and Giovanna Olmos and many more people. So while it was an amazing project that I am really proud of, and I took it in directions close to my personal interests where it made sense, it was also about learning from other voices and approaches, and the Babel proposition was one of those things that originated with another voice.
Can you please elaborate more on the question about the tower? I'm interested in the way you think about those specific set-ups of your work, with the changes and differences you make for different shows. Do you consider those changes (for example, different statues in exhibiting the Dotcom project, etc.) to be the result of some specific system which develops the narration, or are they mostly random? And what about reiterations of projects in a different context - "Venice" in Kunsthalle Wien, for instance?
Changes to how my material is presented in different situations aren't random. Exhibitions appearing in different venues are also not based on a system of rules that carries across every presentation. I look at each exhibition opportunity as it comes up and think about what fits best, within what's possible in terms of time and resources and also what the situation demands or proposes. A group show with a curatorial voice is not the same as a solo presentation. With The Personal Effects of Kim Dotcom this was kind of written into the work - in that case, the reiteration was kind of a system. Each time the show is made, the host institution and I gather material as best one can, according to the list of confiscated Dotcom possessions. That always reflects a budget, time, ingenuity, and effort - all sorts of factors that change from place to place. The contrast with what it would be to present the "real" collection is always huge. But this discrepancy is folded into the logic of the project, where the gap between the crazy value of Dotcom's collection and what an art institution can afford is always underlined. He is a very wealthy man, and his business and lifestyle have always been about performing material success to a certain extent. Art budgets from New Zealand to Austria cannot match this, at least not within the framework I have been able to create. That is part of a work that is about articulating copies and placeholders for value. With my Venice project's sister participation in a group show at the Kunsthalle Wien, it was also a lot about what the curator Nicolaus Schafhausen was interested in and what worked within the constraints of what he had in mind. It was a much more general presentation, with less of the pointed tensions of the presentation in Venice emphasised - but that's what the show seemed to require. So it's different in each and every situation. But that doesn't mean what is presented is random.

What about the idea of constructing repositories of knowledge? How could this gathering of data work much later, when that information is not current any more? Basically, do you think about it as a statement of the here and now, or as a universally usable system of data distribution and interpretation?
I think this relates to your earlier question about design for me - what about when design ages and looks out of date? For me, this question about data and the relevance of events and data of another time is the same. We value cultural objects of the past that contain beautiful reflections of the place/time they were created in. The logic of those objects becomes a summary of what is important to the people/forces that made them. Reflecting current events and ideas and the way things change is, for me, about that entering into the presentations I make. If something has a strong resonance now, it will be valuable in some way in a future that cares about the past.

Do you write? I mean in terms of the essayistic format. (I haven't found anything, but it could be really interesting.)

Unfortunately not so much. Most of my writing is inside my artwork - I very often write or contribute significantly to press releases and wall texts/didactics. There is also increasingly a lot of text annotation on sculptures and paintings, which I have been co-writing with Matt Goerzen. Longer-form essays are something I've not really had the time to write up until now. And it's a craft I don't really know, nor have I worked on it.

You were saying once that a visitor can interpret your work in his own way, obviously - that one can interpret it either as a serious political/economical critique, or just as a parody or some kind of nostalgia-aesthetics joke. Well, Simon, tell me please, where is the boundary between a joke and critical work? That's obviously quite an issue in the realm of "post-postinternet" - with the obvious example of Hito Steyerl and the difference between her original documentary work, her essays, her lectures and... for instance, Factory of the Sun.

I think I would define my terms a bit before I comment on this. I think ultimately the viewer always completes the artwork in the act of experiencing it, which is a conventional assertion in contemporary art, right? I would also say that I find the tone of other artists' work addressing similar topics is often divergent. I am a fan of Hito Steyerl; I think she does an amazing job. I don't read humor in her work as something that obscures critique. I also see her work as something that contains critique as a central structural element. Now, to the question of whether my work is a joke or not, I would firmly say no, it is not. I think any kind of exploration of a topic can involve humor, or elements that propose unlikely assertions. It doesn't necessarily follow that the whole endeavor is therefore a joke. It's just a language of exhibition making that has a range to it. I am always a fan of the material I use in my work, though - and if any humorous elements emerge, it's always out of playful admiration rather than anything close to sarcastic critique.
Simon Denny (b. 1982 in Auckland, New Zealand) is an artist working with installation, sculpture and video. He studied at the Elam School of Fine Arts at the University of Auckland, New Zealand, and at the Städelschule in Frankfurt. Selected solo exhibitions include: Serpentine Gallery, London (2015); MoMA PS1, New York (2015); Portikus, Frankfurt (2014); MuMOK, Vienna (2013); Kunstverein München, Munich (2013); and Aspen Art Museum, Aspen (2012). In 2012, Denny was awarded the Baloise Art Prize at Art Basel Statements. Selected group shows include: After Babel, Moderna Museet, Stockholm (2015); Europe, Europe, Astrup Fearnley Museet, Oslo (2014); Art Post-Internet, Ullens Center for Contemporary Art, Beijing (2014); Speculations on Anonymous Materials, Fridericianum, Kassel (2013); Image into Sculpture, Centre Pompidou, Paris (2013); and Remote Control, ICA, London (2012). Denny represented New Zealand at the 56th Venice Biennale (2015) and was included in the central curated exhibition in 2013. He participated in the 13th Lyon Biennale (2015), the Montreal Biennale (2014), as well as the Sydney Biennale and the Brussels Biennale (both in 2008).
t-baba · 7 years ago
AngularJS and Angular 2+: a Detailed Comparison
This article compares the major differences between the original AngularJS and Angular 2+. If you're currently stuck with an AngularJS project and not sure whether you should make the jump, this article should help you get started.
In recent years, we've seen Angular grow tremendously as a framework and as a platform for developing single page applications (SPAs) and progressive web apps (PWAs). AngularJS was built on the idea that declarative programming should be used for building views. This required decoupling DOM manipulation from the business logic of the application, an approach that had many benefits on its own.
However, AngularJS had many shortcomings in terms of performance and how things worked under the hood. Hence, the development team spent a year rewriting the code from scratch and finally released Angular 2 in late 2016. Most developers felt that Angular 2 was a different platform that had very little resemblance to the original AngularJS.
So let’s compare and contrast AngularJS and Angular 2+.
Frameworks in AngularJS and Angular 2
AngularJS follows the traditional MVC architecture that comprises a model, a view and a controller.
Controller: the controller represents how user interactions are handled and binds both the model and the view.
Views: the view represents the presentation layer and the actual UI.
Model: the model is an abstract representation of your data.
Some developers are of the opinion that AngularJS follows the MVVM pattern, which replaces the Controller with a View-Model. A View-Model is a JavaScript function similar to the controller. What makes it special is that it synchronizes the data between a view and a model: changes made to a UI element automatically propagate to the model, and vice versa.
The following diagram shows how various AngularJS pieces are connected together.
[Diagram: how the AngularJS pieces are connected]
You can read more about AngularJS’s architecture on the official documentation page.
Angular, on the other hand, has a component-based architecture. Every Angular application has at least one component known as the root component. Each component has an associated class that's responsible for handling the business logic and a template that represents the view layer. Multiple, closely related components can be stacked together to create a module, and each module forms a functional unit on its own.
[Diagram: Angular's component-based architecture]
As you can see in the figure, the component is bound to the template. Components are written as TypeScript classes, and templates are attached to them using @Component decorators. Services can be injected into a component using Angular's dependency injection subsystem. The concept of modules in Angular is drastically different from that of AngularJS modules. An NgModule is a container for defining a functional unit. An NgModule can comprise components, services and other functions. The modular unit can then be imported and used with other modules.
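As a minimal illustration of these pieces (a generic sketch in TypeScript, not code from the article), a component and the NgModule that declares it look like this:

// A component: a TypeScript class with a template attached via the @Component decorator
import { Component, NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';

@Component({
  selector: 'app-root',
  template: `<h1>{{ title }}</h1>`, // interpolation binds the class property below
})
export class AppComponent {
  title = 'Hello Angular';
}

// An NgModule groups closely related components (and services) into a
// functional unit; here it also bootstraps AppComponent as the root component.
@NgModule({
  declarations: [AppComponent],
  imports: [BrowserModule],
  bootstrap: [AppComponent],
})
export class AppModule {}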
All the Angular concepts are better explained at Angular.io.
Templates in AngularJS and Angular 2
In AngularJS the template is written using HTML. To make it dynamic, you can add AngularJS-specific code such as attributes, markups, filters and form controls. In addition, it supports the two-way data binding technique mentioned earlier. The following code snippet demonstrates the use of directives and double curly brackets within the template:
<html ng-app>
  <!-- Body tag augmented with ngController directive -->
  <body ng-controller="MyController">
    <input ng-model="foo" value="bar">
    <!-- Button tag with ngClick directive -->
    <!-- Curly brackets are the template binding syntax -->
    <button ng-click="changeFoo()">{{buttonText}}</button>
    <script src="angular.js"></script>
  </body>
</html>
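For completeness, a controller that could back this template might look like the following sketch (MyController and changeFoo come from the snippet above; the function body and the module name are assumptions, and with this form ng-app would become ng-app="app"):

// Illustrative AngularJS controller for the template above
angular.module('app', []).controller('MyController', function ($scope) {
  $scope.foo = 'bar';
  $scope.buttonText = 'Change foo';
  $scope.changeFoo = function () {
    $scope.foo = 'baz'; // two-way binding pushes this straight into the <input>
  };
});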
In Angular, AngularJS’s template structure was reworked and lots of new features were added to the templates. The primary difference was that each component had a template attached to it. All the HTML elements except <html>, <body>, <base>, and <script> work within the template. Apart from that, there are features such as template binding, template interpolation, template statements, property binding, event binding and two-way binding. Built-in attribute directives like NgClass, NgStyle and NgModel and built-in structural directives such as NgIf, NgForOf, NgSwitch are also part of the template.
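A compact sketch of those binding forms gathered in one component template (illustrative, not from the original article; [(ngModel)] additionally requires FormsModule):

// Illustrative component demonstrating the binding forms listed above
import { Component } from '@angular/core';

@Component({
  selector: 'app-bindings-demo',
  template: `
    <img [src]="imageUrl" />                              <!-- property binding -->
    <button (click)="add()">Add</button>                  <!-- event binding -->
    <input [(ngModel)]="name" />                          <!-- two-way binding -->
    <p *ngIf="items.length">{{ items.length }} items</p>  <!-- NgIf + interpolation -->
    <ul>
      <li *ngFor="let item of items">{{ item }}</li>      <!-- NgForOf -->
    </ul>
  `,
})
export class BindingsDemoComponent {
  imageUrl = 'logo.png';
  name = '';
  items: string[] = [];
  add() {
    this.items.push(this.name);
  }
}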
Continue reading AngularJS and Angular 2+: a Detailed Comparison on SitePoint.
by Manjunath M via SitePoint https://ift.tt/2qaRPEg