#digitalcuration
pinetworknewupdate9121987 · 11 months ago
Video
Hot news: hokanews: "JUST IN: Pi Network, The New Crypto Breakthrough Th...
0 notes
kcostanz · 5 years ago
Text
What is aesthetic consumerism?
It’s easy to agree with Sontag’s claim without pausing to understand the critical terms within it.
Aesthetics: a branch of philosophy that deals with the nature of beauty and taste, as well as the philosophy of art.
Consumerism: a social and economic order that encourages the acquisition of goods and services in ever-increasing amounts.
Our growing need to “confirm reality and enhance experience” with photographs leads us to curate a collection of pictures, ideas, and sentiments as a form of self-expression, seeking external validation of a personal identity.
Creating this branded image of ourselves through photos and online profiles allows the public to access, perceive, and sometimes validate that existence. Demanding social pressures urge us to join in this personal curation; not participating means forgoing a vital element of 21st-century interconnection. Together these forces drive a cyclical demand to develop and contribute to this personal brand, fostering what Sontag identifies as an obsession with aesthetic consumerism.
(A snapshot of my Instagram: an aesthetically driven portrayal of identity.)
2 notes
southvision · 6 years ago
Photo
Brooklyn Museum NY 23-05-19 #jetlag #visualpactisioner #artivisit #curation #digitalcuration #artisttravel #NewYork #Newyorkbyday #broonklyn #Brooklyny #broonklynmuseum #usa #phonephotograpy #doogee #Johannesburg #gauteng #SouthAfrica (at Brooklyn Museum) https://www.instagram.com/p/Bx-_y8JDO9U/?igshid=m8bhz70rhy8b
1 note
strathoa · 8 years ago
Text
Progress in the Digital Curation of Research Datasets
Part 2
Alan Slevin, Open Access and Research Data Manager, University of Strathclyde
This is Part 2 of a blog post on developments in the preservation and curation of datasets in the University.
After the end of our Archivematica project (discussed in Part 1) we have continued to develop our manual curation workflow, and work on automating steps where possible.
Initially we had a fairly simple manual process in mind (a sketch of the transfer step follows the list):
1. dataset upload to PURE
2. saving the dataset to the Archivematica transfer folder
3. processing through Archivematica (customised Archivematica workflow: format registry and administrator options)
4. extracting the DIP from the Archivematica storage area
5. re-upload to PURE
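As a rough illustration of step 2, a script along the following lines could move a dataset package retrieved from PURE into the Archivematica transfer source folder, verifying the copy as it goes. This is a minimal sketch: the folder paths are placeholders for illustration, not our production configuration.

```python
import hashlib
import shutil
from pathlib import Path

# Assumed locations -- placeholders for illustration, not production paths.
PURE_EXPORT = Path("/mnt/pure-exports")            # datasets retrieved from PURE
TRANSFER_SOURCE = Path("/mnt/am-transfer-source")  # folder watched by Archivematica

def sha256(path: Path) -> str:
    """Chunked SHA-256 so large dataset files need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def queue_for_ingest(dataset_zip: Path) -> Path:
    """Copy a dataset zip into the transfer source folder, verifying the
    copy by comparing checksums before and after."""
    dest = TRANSFER_SOURCE / dataset_zip.name
    shutil.copy2(dataset_zip, dest)
    if sha256(dataset_zip) != sha256(dest):
        dest.unlink()
        raise OSError(f"checksum mismatch copying {dataset_zip.name}")
    return dest

if __name__ == "__main__":
    for zip_file in PURE_EXPORT.glob("*.zip"):
        print("queued for ingest:", queue_for_ingest(zip_file))
```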
Checksum monitoring, handling large datasets, and the longer-term and ongoing preservation of files would also complicate the standard workflow.
Archivematica Ingest Workflow
These steps follow the routine deposit, editing, DOI minting and validation of the PURE dataset record and files.
We have now refined the manual curation workflow: datasets deposited in PURE are retrieved and transferred to the queue folder for Archivematica ingest; we process SIPs through the various micro-services and use the storage service to extract the DIP; the DIP is re-uploaded to PURE; and the AIP containing the DIP and SIP is stored. Fine-tuning this approach has in the last few months led to two strands of technical development.
Checksums
The way dataset files are stored in PURE makes checksum monitoring difficult. In PURE, the checksum file gives no indication of the dataset (or file) to which it pertains: it shows a checksum and a file size, but it is not possible to determine which bit stream these apply to. A solution might involve storing all dataset checksums and file sizes for comparison with ingested datasets, but this lack of curation-friendliness and interoperability in PURE inhibits our ability to maintain and monitor checksums throughout the workflow. Unless we can get PURE to describe which .bin file relates to which bit stream, datasets cannot be processed through Archivematica with their own checksums.
Ideally, to operate a chain of custody for checksums, we would start monitoring from the checksum as it originates at the first point in our workflow, which is after deposit in PURE. As mentioned, there is a basic problem in retrieving these checksums from PURE, so for our current workflow we identified where checksums were retrievable for the datasets moved to the Archivematica transfer source folder. We can generate a checksum manually for the zip and include it in the transfer for Archivematica (and/or Archivematica will generate one at ingest). However, checksums are only really useful if they are monitored for change, so leaving them stored with the associated METS file and managed in Archivematica may be more practical than storing them with the re-uploaded file in PURE. The level of file granularity at which to record checksums is still to be confirmed. This is one of the drawbacks of our largely non-interoperable infrastructure.
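One low-tech way to start that chain of custody is to hand Archivematica our own checksums inside the transfer, since it can verify a sha256sum-style manifest placed in the transfer's metadata directory at ingest. A minimal sketch follows; the exact relative-path convention expected in the manifest should be confirmed against the Archivematica documentation, so treat that detail as an assumption.

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Chunked SHA-256 suitable for large dataset files."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def write_checksum_manifest(transfer_dir: Path) -> Path:
    """Write metadata/checksum.sha256 covering every file under the
    transfer's objects directory, one '<digest>  <path>' line per file.
    The '../objects/...' path form is an assumption to verify against
    the Archivematica transfer documentation."""
    objects = transfer_dir / "objects"
    metadata = transfer_dir / "metadata"
    metadata.mkdir(parents=True, exist_ok=True)
    lines = [
        f"{file_sha256(f)}  ../objects/{f.relative_to(objects)}"
        for f in sorted(objects.rglob("*")) if f.is_file()
    ]
    manifest = metadata / "checksum.sha256"
    manifest.write_text("\n".join(lines) + "\n")
    return manifest
```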
Dataset Transfer
Getting data from researchers, after deposit in PURE, into the Archivematica transfer source is an important aspect of our digital preservation workflow: data is most at risk during transfer between systems.
We are investigating ways to automate parts of the workflow and to address issues relating to the re-packaging of zip deposits after DIPs are created. For the former, the Puree gems developed at Lancaster, which interrogate the XML exported by the PURE REST service, offer the possibility of transferring dataset metadata and files to the transfer queue without manual involvement; a second gem presents the metadata for ingest to Archivematica. We need to decide which Dublin Core and relational metadata should be transferred from PURE and what preservation and administrative metadata might be appended in a .csv file. We are engaging our local technical support team in this work, with the aim of porting the Rails code behind the Puree gems to a language that reflects local expertise and specialisms.
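By way of illustration, the kind of call the gems wrap might look like the sketch below: pull one dataset record from the PURE web service as XML and extract a few fields. The endpoint path, authentication header and element names vary between PURE versions and are assumptions here.

```python
import requests
import xml.etree.ElementTree as ET

PURE_BASE = "https://pure.example.ac.uk/ws/api/datasets"  # assumed endpoint
API_KEY = "..."  # issued by the PURE administrator

def fetch_dataset_metadata(uuid: str) -> dict:
    """Fetch one dataset record as XML and pull out a few fields.
    The element names below are illustrative, not a fixed PURE schema."""
    resp = requests.get(
        f"{PURE_BASE}/{uuid}",
        headers={"api-key": API_KEY, "Accept": "application/xml"},
        timeout=30,
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    return {
        "uuid": uuid,
        "title": root.findtext(".//title", default=""),
        "doi": root.findtext(".//doi", default=""),
    }
```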
Another facet of technical development might involve re-packaging the ingested DIP into the zip format in which most deposits are made. This could involve quite a detailed change to the software itself, but there is also the possibility of holding all the original and normalised files in the AIP, which is then stored.
The original intention was for normalised dissemination copies to be re-uploaded to PURE. However, this is a problem for zip files, which are the main way datasets are deposited by academics. While the contents of the zip file are normalised by Archivematica, the normalised versions are not automatically re-zipped in the way the files were originally organised. There is functionality in AtoM (Access to Memory, formerly ICA-AtoM) for reconstituting zip files from processed versions, but no equivalent as yet in the datasets pipeline. There have been discussions on pursuing this as a modification within Archivematica.
An alternative approach, which would minimise development work in the short term, is to follow the Bentley Historical Library model in Archivematica 1.6: the ingested zip and the normalised versions of its files are all made accessible in a single zip file. This sounds attractive for our purposes, since we receive a lot of proprietary files that are untouched by Archivematica. The access zip contains the contents of the AIP's objects directory: every original file (which, even where no normalisation takes place, has still been through Archivematica's virus-checking, checksumming and file-characterisation processes) alongside any normalised versions, including normalised versions of the files contained in the original zip.
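If we did pursue re-zipping as a local development rather than waiting on an upstream modification, the core of it is straightforward: walk the DIP's objects directory and rebuild an archive mirroring the deposited layout. A sketch, with directory names assumed (note that Archivematica prefixes DIP object names with a file UUID, which a real implementation would need to strip):

```python
import zipfile
from pathlib import Path

def rezip_dip(dip_objects: Path, out_zip: Path) -> None:
    """Rebuild a single zip from the files in a DIP's objects directory,
    preserving relative folder structure. Stripping the UUID prefix that
    Archivematica adds to DIP file names is left out of this sketch."""
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(dip_objects.rglob("*")):
            if f.is_file():
                zf.write(f, arcname=f.relative_to(dip_objects))

if __name__ == "__main__":
    rezip_dip(Path("DIP-example/objects"), Path("dataset-normalised.zip"))
```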
Metadata
Archivematica provides multiple ways to get metadata into the AIP METS profile, for example the PREMIS rights form for entry during ingest. We need to explore the possibility of using the Archivematica metadata fields as a supplement to the PURE metadata record in the short-term, and/or explore ways to parse and report based on the METS file. If we are able to transfer files using the new PURE REST services we could augment DC metadata from PURE with administrative and preservation metadata from Archivematica in an attached .csv file.
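Archivematica can also pick up descriptive metadata supplied as a metadata/metadata.csv file within the transfer and carry it into the AIP METS, which is one route for the attached .csv mentioned above. A minimal sketch; the particular Dublin Core columns chosen here are illustrative.

```python
import csv
from pathlib import Path

def write_metadata_csv(transfer_dir: Path, records: list) -> Path:
    """Write metadata/metadata.csv for an Archivematica transfer. Each
    record is a dict keyed by 'filename' plus Dublin Core terms; the
    column set here is illustrative."""
    path = transfer_dir / "metadata" / "metadata.csv"
    path.parent.mkdir(parents=True, exist_ok=True)
    fields = ["filename", "dc.title", "dc.creator", "dc.date", "dc.rights"]
    with path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(records)
    return path

if __name__ == "__main__":
    write_metadata_csv(Path("transfer-example"), [{
        "filename": "objects/dataset.zip",
        "dc.title": "Example research dataset",
        "dc.creator": "Example, Researcher",
        "dc.date": "2017",
        "dc.rights": "CC BY 4.0",
    }])
```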
The AIP itself contains (in addition to the ingested digital objects) preservation versions of the objects, submission documentation, logs, and the METS file. As described, in the absence of a solution for re-packaging normalised zip files for access, we have considered offering this zip file itself for access, which is now possible in the latest version (1.6) of Archivematica.
Archivematica uses a Format Policy Registry (FPR) to provide rules on how particular file formats are processed. While the normalisation rules in the default FPR are generally in line with our own data deposit policy, we should aim to list preferred, supported and unsupported formats through our own tailored FPR, customising Archivematica across its sub-services so that files are identified with the most suitable tool, formats are characterised, validated and normalised appropriately, and the PREMIS (events) metadata important to understanding the dataset is stored.
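The FPR itself is configured through the Archivematica dashboard rather than in code, but it helps to keep our local format policy as a simple machine-readable record against which the FPR rules can be reviewed. A hypothetical sketch of such a record:

```python
# A hypothetical local format policy -- our own working record, not the
# FPR's internal storage format. Each entry states how we expect the FPR
# rule for that format to behave.
FORMAT_POLICY = {
    ".csv":  {"status": "preferred",   "normalise_to": None},
    ".docx": {"status": "supported",   "normalise_to": ".pdf"},
    ".xlsx": {"status": "supported",   "normalise_to": ".csv"},
    ".sav":  {"status": "unsupported", "normalise_to": None},  # SPSS: kept as deposited
}

def policy_for(filename: str) -> dict:
    """Look up our expectation for a deposited file by extension."""
    ext = "." + filename.rsplit(".", 1)[-1].lower()
    return FORMAT_POLICY.get(ext, {"status": "unsupported", "normalise_to": None})
```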
During testing, and in noting the lack of reporting functionality, our attention was drawn to a new Archivematica fixity app which may merit further investigation for our workflow. The app can monitor the checksums of AIPs in archival storage: if a DIP in PURE becomes corrupted and needs replacing, we can re-ingest the AIP to regenerate the DIP. Given that we lack intelligent object storage that might do this job, and that Archivematica itself does not monitor ongoing file integrity, such a tool could prove very useful if applied properly. During the project, Artefactual invited us to write up our own requirements in this area for possible development, recognising that each HEI's workflow is likely to be different.
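The same check is also exposed through the Archivematica Storage Service API, so a scheduled job could trigger fixity verification of stored AIPs. The endpoint and response shape below reflect our reading of the Storage Service documentation and should be treated as assumptions.

```python
import requests

STORAGE_SERVICE = "https://am-ss.example.ac.uk"     # assumed local address
HEADERS = {"Authorization": "ApiKey user:api_key"}  # Storage Service credentials

def check_aip_fixity(aip_uuid: str) -> bool:
    """Ask the Storage Service to re-verify the checksums of one stored
    AIP; endpoint per our reading of the Storage Service API docs."""
    resp = requests.get(
        f"{STORAGE_SERVICE}/api/v2/file/{aip_uuid}/check_fixity/",
        headers=HEADERS,
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json().get("success", False)
```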
PURE User’s Future?
While all these developments continue we must keep a close eye on the broader picture, with the prospect of a systems review in the near future. We might continue to use PURE in the same way; however, considering Elsevier's apparent focus on Mendeley Data (and on integrating it with PURE), their continued support for the datasets module in PURE, and particularly its further development, is questionable. So we need to consider the implications of a PURE-Mendeley-DANS workflow: are all the data lifecycle stages covered, what are the cost and workflow implications, and how might researchers respond to a two-way deposit process? Then, of course, we are waiting for practical and interoperable solutions from the JISC Shared Services project, or at least useful ideas which might be incorporated within our existing network of supporting tools.
The priority is to promote engagement throughout the data lifecycle and to provide systems which are easy for academics to use and which address the main funder-mandated requirements. On the dataset side we can respond quickly to DOI requirements via the datasets module in PURE and provide a repository for retained data and metadata. Our secure, duplicated storage safeguards data, while our work with Archivematica, though still in development, provides evidence that we are actively addressing the requirements for long-term preservation and active curation.
Longer-term developments in PURE and in local infrastructure have the potential to address interoperability issues, while development work involving Archivematica users and the wider digital preservation community, who may also use PURE and associated tools, is continuing. We should get used to working with Archivematica in our existing workflow with real datasets while monitoring the broader picture. A fully fledged RDM solution is likely to consist of a variety of systems performing different functions within the workflow; Archivematica will fit well into this modular architecture.
We have already met impediments to local development centred on the availability of local IT support and so progress is hard to gauge at this time. Also, work with Artefactual to customise the software would be on a different contract from our current maintenance agreement. Fortunately the parallel work in different JISC projects promises some relevant developments and we will stay involved through the Archivematica User Group.
While we have much customisation of Archivematica ahead and more to discover about the suitability of its many processes to our workflow, we can say at this stage that it provides the preservation workflow tools to support our Research Data Deposit Policy and our curation and preservation strategies. It allows us to set and use our storage pathways appropriately and fills a gap in the current services supporting the RDM data lifecycle.
The key requirements for further development are better interoperability, if PURE remains the point of data deposit, and further customisation of the system to strike a good balance, depending on the file formats in question, between the effectiveness of the automated curation processes and the human intervention which is still crucial at key decision points in the workflow.
Mediation and Data Quality Issues
There are other soft or socio-technical issues to do with dataset deposit and data quality checking which must be factored into workflow decisions. These data management routines would take place at the initial deposit stage but will affect what is sent to Archivematica.
Depositor agreement – can we generate a click-through agreement? If not, do we need some signed paperwork from the academic? This is being looked at in the PURE user group.
Notwithstanding the Legal Requirements metadata in PURE, the administrator should refer to the DMP in checking on the disclosure of personal data.
Granularity: when do we suggest files should be organised as collections, with different (child) templates created according to particular themes? This might apply, for example, where individual publications are linked to datasets.
Applying pre-ingest checks in a self-deposit workflow continues to be an issue. How stringent should we be on 'quality' checks, e.g. folder names, (lack of) documentation, evident file structure, with or without a DMP being available? A fairly rigorous approach to this stage probably means contacting the depositor in most cases; the reuse potential of the data may be a factor here. (A sketch of automated pre-ingest screening follows this list.)
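Some of these 'human' checks can at least be flagged automatically before an administrator looks at a deposit. A sketch of the kind of pre-ingest screening we have in mind, with the individual heuristics chosen purely for illustration:

```python
from pathlib import Path

def pre_ingest_report(deposit: Path) -> list:
    """Flag common quality problems in a self-deposited dataset folder.
    The checks are illustrative heuristics, not a fixed policy."""
    warnings = []
    files = [f for f in deposit.rglob("*") if f.is_file()]
    if not any(f.name.lower().startswith("readme") for f in files):
        warnings.append("no README or accompanying documentation found")
    for f in files:
        if " " in f.name:
            warnings.append(f"space in file name: {f.name!r}")
        if f.stat().st_size == 0:
            warnings.append(f"empty file: {f.name}")
    if len(files) > 1 and all(f.parent == deposit for f in files):
        warnings.append("flat structure: consider organising files into folders")
    return warnings

if __name__ == "__main__":
    for w in pre_ingest_report(Path("deposit-example")):
        print("WARNING:", w)
```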
Particularly with the last point on documenting and checking datasets: while Archivematica assesses, virus-checks, characterises and, where appropriate (according to the format registry), normalises files, the actual organisation and naming of files and folders, the structure of zip files, and the existence of accompanying documentation to promote data reuse and replicability form a 'human' area of checking not generally covered by curatorial administrators. While self-deposit of datasets remains the norm, our main technique is to promote best practice among researchers in these areas through training and good pre-ingest guidance.
In summary, we believe that we have made progress in this area and therefore in addressing funder expectations in the preservation and curation of research datasets to ensure their long-term accessibility. However, there are obvious gaps in the interoperability of systems and our own processes which need to be overcome in order to provide services which are truly reflective of OAIS best practice. We are operating in a limited technical environment in terms of the tools available and the technical support at hand, but we have developed a number of steps which will ensure that the datasets deposited here are preserved for longer term access. While we are still operating on the basis that PURE remains the point of self-deposit, these limitations will remain. The Elsevier proposal for using Mendeley Data as a more fully functioning and integrated Data Repository linked to PURE (which will retain the data registry role for compliance and reporting) might change this dynamic. However, it seems safe to assume that this decision – in tandem with the JISC Shared Services work – should follow on from a comprehensive evaluation of the interlinking systems available as this area of RDM continues to develop.
1 note
rbastien1234 · 3 years ago
Photo
#vancouverartgallery #abstractpaintingart #installationart #digitalcurator #blockchain #digitalcryptoary https://www.instagram.com/p/Ce9md__POv-/?igshid=NGJjMDIxMWI=
0 notes
shreehq · 4 years ago
Photo
We have simplified your journey of growth with the most trending technology at the moment.  Be a part of the SHREE Ecosystem to take a leap towards growth. #श्री #Shree #ShreeCoin #ShreeWallet #Blockchain #Bitcoin #bitcoinews #FinTech #Trading #lowcost #investment #investmentoptions  #digitalmoney #digitalcurrency #tokens #coins
1 note
orum · 4 years ago
Photo
Reposted from @klyngecontemporary: Ebeltoft after The Flood. Solo show by Kristoffer Ørum at Oxer Svartlöga. Photos by Mikkel Høgh Kaldal. @kristofferorum @oxer_svartloga #ebeltoftaftertheflood #kristofferørum #optimisticapocalypse #archaeology #oyesters #climatechange #archipelago #algorithms #song #video #sculpture #installation #digitalcurating #scandinavianart #nordicart #contemporaryart #nydanskkunst #kunst #konst #samtidskunst #samtidskonst (at Svartlöga) https://www.instagram.com/p/CWnMTP0svpv/?utm_medium=tumblr
0 notes
nftmarket · 4 years ago
Photo
Colexion NFT- Exclusive combo packs
Gain multiple access and special powers at the same time with the Colexion combo pack including black, gold, and platinum cards.
visit: colexion.io/
Instagram: https://www.instagram.com/colexion.nft/
0 notes
debajitadhikary · 4 years ago
Text
Cryptocurrency Exchange Binance to Stop Support for Stock Tokens
Bloomberg | Quint is a multiplatform, Indian business and financial news company. We combine Bloomberg’s global leadership in business and financial news and data, with Quintillion Media’s deep expertise in the Indian market and digital news delivery, to provide high quality business news, insights and trends for India’s sophisticated audiences.
0 notes
stylebydaliadrake · 4 years ago
Photo
SHE RIDE FOR HER for @footlockerwomen 💖💝 TEAM: Creative Director & Stylist: Dalia Drake @daliadrake Hair Stylist: Lurissa Lawson @lurissaingridhair Make-up: Sereena M. @krystalized.makeovers Model: Sinn Apsara @thesinncity Photographer: Steve Warren @_stevewarren Production Director: Ken Mizako @kenmizako WARDROBE: Showroom: Flying Solo @flyingsolonyc Dress - Designer: Elma Dawy @elmadawynewyork Sneakers: Coated Club C 💘 by @reebok & @iamcardib 💞 A huge TY to @alekza.latte 💓 #daliadrake #lifestyle #creativedirector #stylist #digitalcurator #ad #footlockerwomens #reebok #cardib #sneakerculture #readytowear #editorial #fashion #style #streetwear #highfashion #love #pinkaesthetic #explore #becausesneakers #housekicks #housefits #sneakerhead (at NYFW) https://www.instagram.com/p/CLe8uj1DyRC/?igshid=1eu9k4k7aw3hh
0 notes
dreibayern · 5 years ago
Photo
#digitalcinema #digitalilustration #digitalcalligraphy #digitalplayground #digitalisation #digitalcommunications #digitalcreative #digitalanalytics #digitalunderground #digitalrudrabha #digitalcurency #digitalagencydubai https://www.instagram.com/p/CHB9XsapJoz/?igshid=1hc3pyu0wtje4
0 notes
southvision · 6 years ago
Photo
Bronx Museum NY 22-05-19 #visualpactisioner #artivisit #curation #digitalcuration #artisttravel #NewYork #Newyorkbyday #broonklyn #thebronx #bronkxmuseum #bronxnation #usa #phonephotograpy #doogee #Johannesburg #gauteng #southafrica (at Bronx Museum of the Arts) https://www.instagram.com/p/ByCuiwDDxNn/?igshid=1opcbc3z5uih7
0 notes
kochstephane · 2 years ago
Text
White paper: “Enseigner et apprendre à l’ère de l’Intelligence Artificielle. Acculturation, intégration et usages créatifs de l’IA en éducation”
See on Scoop.it - digitalcuration
As part of the Digital Thematic Group (GTnum) #Scol_IA, “Renouvellement des pratiques numériques et usages créatifs du numérique et IA”, we are pleased to present the white paper “Enseigner et apprendre à l’ère de l’intelligence artificielle. Acculturation, intégration et usages créatifs de l’intelligence artificielle en éducation”. The white paper is edited by Margarida Romero, Laurent…
6 notes
rbastien1234 · 3 years ago
Photo
#vancouverartgallery #abstractpaintingart #installationart #digitalcurator #blockchain https://www.instagram.com/p/Ce8yRZOOg6E/?igshid=NGJjMDIxMWI=
0 notes
shreehq · 4 years ago
Photo
Digital assets are an alternative investment to STOCKS AND SHARES. But they are more secure with blockchain technology, and transactions can be made 24/7.
1 note