Dave Ward d437d5105d Merged V4.0-BUG-FIX to HEAD
36311: BDE-69: filter long tests if minimal.testing property is defined
   36314: Merged V4.0 to V4.0-BUG-FIX (RECORD ONLY)
      36247: ALF-11027: temporarily remove import of maven.xml, since it makes ant calls fail from enterpriseprojects
   36331: ALF-12447: Further changes required to fix lower case meta-inf folder name
   36333: Revert ALF-12447.
   36334: ALF-14115: Merged V3.4-BUG-FIX to V4.0-BUG-FIX
      36318: ALF-12447: Fix case on META-INF folder for SDK
      36332: ALF-12447: Further changes required to fix lower case meta-inf folder name
   36337: ALF-14115: Merged V3.4-BUG-FIX to V4.0-BUG-FIX
      36332: ALF-12447: Yet more meta-inf case changes needed.
   36342: ALF-14120: fix only completed tasks returned
   36343: ALF-13898: starting workflow from IMAP now using workflowDefs with engine name included, fallback to appending $jbpm when not present, to preserve backwards compatibility.
   36345: Fix for ALF-12730 - Email Space Users fails if template is used
   36346: Fix for ALF-9466 - We can search contents sorted by categories in Advanced search in Share, but saved search will not be shown in UI.
   36364: Switch version to 4.0.3
   36375: Merged BRANCHES/DEV/CLOUDSYNCLOCAL2 to BRANCHES/DEV/V4.0-BUG-FIX:
      36366: Tweak to implementation to ensure that on-authentication-failed, the status is updated within a r/w transaction.
      36374: Provide more specific exceptions from the Remote Connector Service for client and server errors
   36376: Fix ALF-14121 - Alfresco fails to start if using "replicating-content-services-context.xml"
   36393: Final part of ALF-13723 SOLR does not include the same query unit tests as lucene
   - CMIS typed query and ordering tests
   36432: ALF-14133: Merged V3.4-BUG-FIX (3.4.10) to V4.0-BUG-FIX (4.0.3)
      << 4.0.x specific change: Changed transformer.complex.OOXML.Image into transformer.complex.Any.Image >>
      << allowing any transformer to be selected for the conversion to JPEG >>
      36427: ALF-14131 Complex transformers fail if a lower level transformer fails even though there is another transformer that could do the transformation
         - Added a base spring bean for all complex transformers
      36362: ALF-14131 Complex transformers fail if a lower level transformer fails even though there is another transformer that could do the transformation
   36434: Test fix for ALF-13723 SOLR does not include the same query unit tests as lucene
   - CMIS test data change broke AFTS ID ordering
   36503: Removed thousands of compiler warnings (CMIS query test code)
   36518: Fix for ALF-13778 - Links on Share Repository search page show incorrect link name; do not work when root-node is defined.
   Fix now means that Share search correctly handles overridden Repository root node setting. Original work by Vasily Olhin.
   36520: BDE-69: filter all repo tests if minimal.testing property is defined
   36534: ALF-14116: Latest Surf libs (r1075) - ensure that i18n extensions can process browser sent short locales
   36563: Merged V3.4-BUG-FIX to V4.0-BUG-FIX
      36336: ALF-12447: Yet more meta-inf case changes needed.
      36347: Fix for ALF-13920 - Error occurred when try to edit/delete category
      36352: Fix for ALF-13123 - Invalid JSON format from Get Node Tags Webscript - strings not double-quoted. Also fixed POST webscript with same issue.
      36399: ALL LANG: translation updates based on EN r36392
      36421: Fix for Mac Lion versioning issue. ALF-12792 (Part 1 of 2)
      Enable the InfoPassthru and Level2Oplocks server capability flags; InfoPassthru is the flag that fixes the Mac Lion versioning error.
      Added support for filesystems that do not implement the NTFS streams interface in the CIFS transact rename processing, for the Alfresco repo filesystem.
      36422: Fix for Mac Lion versioning issue. ALF-12792 (Part 2 of 2)
      Enable the InfoPassthru and Level2Oplocks server capability flags; InfoPassthru is the flag that fixes the Mac Lion versioning error.
      36423: Add support for file size tracking in the file state. ALF-13616 (Part 1 of 2)
      36424: Fix for Mac MS Word file save issue. ALF-13616 (Part 2 of 2)
      Added live file size tracking to file writing/folder searches so the correct file size is returned before the file is closed.
      36444: Merged DEV to V3.4-BUG-FIX
         36419: ALF-12666 Search against simple-search-additional-attributes doesn't work properly
            SearchContext.buildQuery(int) method was changed.
      36446: Fix for ALF-13404 - Performance: 'Content I'm Editing' dashlet is slow to render when there is lots of data/sites
       - Effectively removed all PATH based queries using the pattern /companyhome/sites/*/container//* as they are a non-optimized case
       - Replaced the "all sites" doclist query using the above pattern with /companyhome/sites//* plus post query resultset processing based on documentLibrary container matching regex
       - Optimized favorite document query to remove need for a PATH
       - Optimized Content I'm Editing discussion PATH query to use /*/* instead of /*//*
       - Fixed issue where Content I'm Editing discussion results would not always show the root topics that a user has edited
        - Added some additional doclist.get.js query scriptlogger debugging output
       36449: ALF-13404 - Fix for issue where favorites for all sites would be shown in each site document library in the My Favorites filter.
      36475: ALF-14131 Complex transformers fail if a lower level transformer fails even though there is another transformer that could do the transformation
         - Change base spring bean on example config file
      36480: 36453: ALF-3881 : ldap sync deletion behaviour not flexible enough
         - synchronization.allowDeletions parameter introduced
         - default value is true (existing behaviour)
         - when false, no missing users or groups are deleted from the repository
         - instead they are cleared of their zones and missing groups are cleared of all their members
         - colliding users and groups from different zones are also 'moved' rather than recreated
         - unit test added
      36491: Added CIFS transact2 NT passthru levels for set end of file/set allocation size. ALF-13616.
      Also updated FileInfoLevel with the latest list of NT passthru information levels.
      36497: Fixed ALF-14163: JavaScript Behaviour broken: Node properties cannot be cast to java.io.Serializable
       - Fallout from ALF-12855
       - Made class Serializable (like HashMap would have been)
       - Fixed line endings, too
      36531: ALF-13769: Merged BELARUS/V3.4-BUG-FIX-2012_04_05 to V3.4-BUG-FIX (3.4.10)
         35150: ALF-2645 : 3.2+ ldap sync debug information is too scarce 
            - Improved LDAP logging.
      36532: ALF-13769: BRANCHES/DEV/BELARUS/V3.4-BUG-FIX-2012_01_26 to V3.4-BUG-FIX (3.4.10)
         36461: ALF-237: WCM: File conflicts cause file order not to be consistent
             - It is reasonable to set the checkbox values using the list indexes, since these do not change. When the dialog is submitted, the getSelectedNodes method
               is invoked and resolves the selected nodes from the checkbox values in the "paths" list.
      36535: Merged DEV to V3.4-BUG-FIX
         36479: ALF-8918 : Cannot "edit offline" a web quick start publication
            A check in TaggableAspect.onUpdatePropertiesOnCommit() was extended to skip the update, if no tags were changed.
      36555: Merged V3.4 to V3.4-BUG-FIX
         36294: ALF-14039: Merged HEAD to V3.4
            31732: ALF-10934: Prevent potential start/stop ping-pong of subsystems across a cluster
               - When a cluster boots up or receives a reinit message it shouldn't be sending out any start messages
   36566: Merged V3.4-BUG-FIX to V4.0-BUG-FIX (RECORD ONLY)
      36172: Merged BRANCHES/DEV/V4.0-BUG-FIX to BRANCHES/DEV/V3.4-BUG-FIX:
         36169: ALF-8755: After renaming content / space by Contributor via WebDAV new items are created
   36572: Merged V4.0 to V4.0-BUG-FIX
      36388: ALF-14025: Updated Surf libs (1071). Fixes to checksum-disabled dependency handling
      36392: ALF-14129 Failed to do upgrade from 3.4.8 to 4.0.2
         << Committed change for Frederik Heremans >>
         - Moved actual activiti-tables creation to before the upgrade
      36409: Fix for ALF-14124 Solr is not working - Errors occur during the startup
      36466: Fix for ALF-12770 - Infinite loop popup alert in TinyMCE after XSS injection in Alfresco Explorer online edit.
      36501: Merged DEV to V4.0
         36496: ALF-14063 : CLONE - Internet Explorer hangs when using the object picker with a larger number of documents
            YUI 2.9.0 library was modified to use chunked unloading of listeners via a series of setTimeout() functions in event.js for IE 6,7,8.
      36502: ALF-14105: Share Advanced search issue with the form values
      - Fix by David We
      36538: ALF-13986: Updated web.xml and index.jsp redirect to ensure that SSO works with proper surf site-configuration customization
       36539: Fix for ALF-14167 Filtering by Tags/Categories doesn't find any content in Repository/DocumentLibrary
      - fix default namespace back to "" -> "" and fix the specific SOLR tests that require otherwise.
      36541: ALF-14082: Input stream leaks in thumbnail rendering webscripts
      36560: Correctly size content length header after HTML stripping process (ALF-9365)
   36574: Merged V4.0 to V4.0-BUG-FIX (RECORD ONLY)
      36316: Merged V4.0-BUG-FIX to V4.0 (4.0.2)
      36391: Merged V4.0-BUG-FIX to V4.0
         36376: Fix ALF-14121 - Alfresco fails to start if using "replicating-content-services-context.xml"


git-svn-id: https://svn.alfresco.com/repos/alfresco-enterprise/alfresco/HEAD/root@36576 c4b6b30b-aa2e-2d43-bbcb-ca4b014f7261
2012-05-18 17:00:53 +00:00


/*
* Copyright (C) 2005-2012 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.repo.content;
import java.io.File;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.model.ContentModel;
import org.alfresco.repo.avm.AVMNodeConverter;
import org.alfresco.repo.content.ContentServicePolicies.OnContentPropertyUpdatePolicy;
import org.alfresco.repo.content.ContentServicePolicies.OnContentReadPolicy;
import org.alfresco.repo.content.ContentServicePolicies.OnContentUpdatePolicy;
import org.alfresco.repo.content.cleanup.EagerContentStoreCleaner;
import org.alfresco.repo.content.filestore.FileContentStore;
import org.alfresco.repo.content.filestore.FileContentWriter;
import org.alfresco.repo.content.transform.ContentTransformer;
import org.alfresco.repo.content.transform.ContentTransformerRegistry;
import org.alfresco.repo.content.transform.TransformerDebug;
import org.alfresco.repo.node.NodeServicePolicies;
import org.alfresco.repo.policy.ClassPolicyDelegate;
import org.alfresco.repo.policy.JavaBehaviour;
import org.alfresco.repo.policy.PolicyComponent;
import org.alfresco.repo.transaction.RetryingTransactionHelper;
import org.alfresco.service.cmr.avm.AVMService;
import org.alfresco.service.cmr.dictionary.DataTypeDefinition;
import org.alfresco.service.cmr.dictionary.DictionaryService;
import org.alfresco.service.cmr.dictionary.InvalidTypeException;
import org.alfresco.service.cmr.dictionary.PropertyDefinition;
import org.alfresco.service.cmr.repository.ContentData;
import org.alfresco.service.cmr.repository.ContentIOException;
import org.alfresco.service.cmr.repository.ContentReader;
import org.alfresco.service.cmr.repository.ContentService;
import org.alfresco.service.cmr.repository.ContentWriter;
import org.alfresco.service.cmr.repository.MimetypeService;
import org.alfresco.service.cmr.repository.NoTransformerException;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeService;
import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.cmr.repository.TransformationOptions;
import org.alfresco.service.cmr.usage.ContentQuotaException;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.EqualsHelper;
import org.alfresco.util.Pair;
import org.alfresco.util.TempFileProvider;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.extensions.surf.util.I18NUtil;
/**
* Service implementation acting as a level of indirection between the client
* and the underlying content store.
* <p>
* Note: This class was formerly the {@link RoutingContentService} but the
* 'routing' functionality has been pushed into the {@link AbstractRoutingContentStore store}
* implementations.
*
* @author Derek Hulley
* @since 3.2
*/
public class ContentServiceImpl implements ContentService, ApplicationContextAware
{
private static Log logger = LogFactory.getLog(ContentServiceImpl.class);
private DictionaryService dictionaryService;
private NodeService nodeService;
private AVMService avmService;
private MimetypeService mimetypeService;
private RetryingTransactionHelper transactionHelper;
private ApplicationContext applicationContext;
protected TransformerDebug transformerDebug;
/** a registry of all available content transformers */
private ContentTransformerRegistry transformerRegistry;
/** The cleaner that will ensure that rollbacks clean up after themselves */
private EagerContentStoreCleaner eagerContentStoreCleaner;
/** the store to use. Any multi-store support is provided by the store implementation. */
private ContentStore store;
/** the store for all temporarily created content */
private ContentStore tempStore;
private ContentTransformer imageMagickContentTransformer;
/** Should we consider zero byte content to be the same as no content? */
private boolean ignoreEmptyContent;
private boolean transformerFailover;
/**
* The policy component
*/
private PolicyComponent policyComponent;
/*
* Policy delegates
*/
ClassPolicyDelegate<ContentServicePolicies.OnContentUpdatePolicy> onContentUpdateDelegate;
ClassPolicyDelegate<ContentServicePolicies.OnContentPropertyUpdatePolicy> onContentPropertyUpdateDelegate;
ClassPolicyDelegate<ContentServicePolicies.OnContentReadPolicy> onContentReadDelegate;
public void setRetryingTransactionHelper(RetryingTransactionHelper helper)
{
this.transactionHelper = helper;
}
public void setDictionaryService(DictionaryService dictionaryService)
{
this.dictionaryService = dictionaryService;
}
public void setNodeService(NodeService nodeService)
{
this.nodeService = nodeService;
}
public void setMimetypeService(MimetypeService mimetypeService)
{
this.mimetypeService = mimetypeService;
}
public void setTransformerRegistry(ContentTransformerRegistry transformerRegistry)
{
this.transformerRegistry = transformerRegistry;
}
public void setEagerContentStoreCleaner(EagerContentStoreCleaner eagerContentStoreCleaner)
{
this.eagerContentStoreCleaner = eagerContentStoreCleaner;
}
public void setStore(ContentStore store)
{
this.store = store;
}
public void setPolicyComponent(PolicyComponent policyComponent)
{
this.policyComponent = policyComponent;
}
public void setAvmService(AVMService service)
{
this.avmService = service;
}
public void setImageMagickContentTransformer(ContentTransformer imageMagickContentTransformer)
{
this.imageMagickContentTransformer = imageMagickContentTransformer;
}
public void setIgnoreEmptyContent(boolean ignoreEmptyContent)
{
this.ignoreEmptyContent = ignoreEmptyContent;
}
/**
* Allows failover from one transformer to another when there is
* more than one transformer available. The cost is that the output
* of the transformer must go to a temporary file in case it fails.
* @param transformerFailover {@code true} indicates that failover
* should take place.
*/
public void setTransformerFailover(boolean transformerFailover)
{
this.transformerFailover = transformerFailover;
}
/* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException
{
this.applicationContext = applicationContext;
}
/**
* Helper setter of the transformer debug.
* @param transformerDebug the transformer debug helper
*/
public void setTransformerDebug(TransformerDebug transformerDebug)
{
this.transformerDebug = transformerDebug;
}
/**
* Service initialisation
*/
public void init()
{
// Set up a temporary store
this.tempStore = new FileContentStore(this.applicationContext, TempFileProvider.getTempDir().getAbsolutePath());
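// The temp store is a FileContentStore rooted in the system temp directory; it backs getTempWriter()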
// Bind on update properties behaviour
this.policyComponent.bindClassBehaviour(
NodeServicePolicies.OnUpdatePropertiesPolicy.QNAME,
this,
new JavaBehaviour(this, "onUpdateProperties"));
// Register on content update policy
this.onContentUpdateDelegate = this.policyComponent.registerClassPolicy(OnContentUpdatePolicy.class);
this.onContentPropertyUpdateDelegate = this.policyComponent.registerClassPolicy(OnContentPropertyUpdatePolicy.class);
this.onContentReadDelegate = this.policyComponent.registerClassPolicy(OnContentReadPolicy.class);
}
/**
* Update properties policy behaviour
*
* @param nodeRef the node reference
* @param before the before values of the properties
* @param after the after values of the properties
*/
public void onUpdateProperties(
NodeRef nodeRef,
Map<QName, Serializable> before,
Map<QName, Serializable> after)
{
// ALF-254: empty files (0 bytes) do not trigger content rules
if (nodeService.hasAspect(nodeRef, ContentModel.ASPECT_NO_CONTENT))
{
return;
}
// Don't duplicate work when firing multiple policies
Set<QName> types = null;
OnContentPropertyUpdatePolicy propertyPolicy = null; // Doesn't change for the node instance
// Variables to control firing of node-level policies (any content change)
boolean fire = false;
boolean isNewContent = false;
// check if any of the content properties have changed
for (QName propertyQName : after.keySet())
{
// is this a content property?
PropertyDefinition propertyDef = dictionaryService.getProperty(propertyQName);
if (propertyDef == null)
{
// the property is not recognised
continue;
}
else if (!propertyDef.getDataType().getName().equals(DataTypeDefinition.CONTENT))
{
// not a content type
continue;
}
else if (propertyDef.isMultiValued())
{
// We don't fire notifications for multi-valued content properties
continue;
}
try
{
ContentData beforeValue = (ContentData) before.get(propertyQName);
ContentData afterValue = (ContentData) after.get(propertyQName);
boolean hasContentBefore = ContentData.hasContent(beforeValue)
&& (!ignoreEmptyContent || beforeValue.getSize() > 0);
boolean hasContentAfter = ContentData.hasContent(afterValue)
&& (!ignoreEmptyContent || afterValue.getSize() > 0);
// There are some shortcuts here
if (!hasContentBefore && !hasContentAfter)
{
// Really, nothing happened
continue;
}
else if (EqualsHelper.nullSafeEquals(beforeValue, afterValue))
{
// Still, nothing happening
continue;
}
// Check for new content
isNewContent = !hasContentBefore && hasContentAfter;
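// Note: if several content properties change in the same update, isNewContent reflects the last one processed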
// So debug ...
if (logger.isDebugEnabled())
{
String name = (String) nodeService.getProperty(nodeRef, ContentModel.PROP_NAME);
logger.debug(
"Content property updated: \n" +
" Node Name: " + name + "\n" +
" Property: " + propertyQName + "\n" +
" Is new: " + isNewContent + "\n" +
" Before: " + beforeValue + "\n" +
" After: " + afterValue);
}
// Fire specific policy
types = getTypes(nodeRef, types);
if (propertyPolicy == null)
{
propertyPolicy = onContentPropertyUpdateDelegate.get(nodeRef, types);
}
propertyPolicy.onContentPropertyUpdate(nodeRef, propertyQName, beforeValue, afterValue);
// We also fire an event if *any* content property is changed
fire = true;
}
catch (ClassCastException e)
{
// properties don't conform to model
continue;
}
}
// fire?
if (fire)
{
// Fire the content update policy
types = getTypes(nodeRef, types);
OnContentUpdatePolicy policy = onContentUpdateDelegate.get(nodeRef, types);
policy.onContentUpdate(nodeRef, isNewContent);
}
}
/**
* Helper method to lazily populate the types associated with a node
*
* @param nodeRef the node
* @param types any existing types
* @return the types - either newly populated or just what was passed in
*/
private Set<QName> getTypes(NodeRef nodeRef, Set<QName> types)
{
if (types != null)
{
return types;
}
types = new HashSet<QName>(this.nodeService.getAspects(nodeRef));
types.add(this.nodeService.getType(nodeRef));
return types;
}
@Override
public long getStoreFreeSpace()
{
return store.getSpaceFree();
}
@Override
public long getStoreTotalSpace()
{
return store.getSpaceTotal();
}
/** {@inheritDoc} */
public ContentReader getRawReader(String contentUrl)
{
ContentReader reader = null;
try
{
reader = store.getReader(contentUrl);
}
catch (UnsupportedContentUrlException e)
{
// The URL is not supported, so we spoof it
reader = new EmptyContentReader(contentUrl);
}
if (reader == null)
{
throw new AlfrescoRuntimeException("ContentStore implementations may not return null ContentReaders");
}
// set extra data on the reader
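// a raw URL carries no node metadata, so default to binary content, UTF-8 encoding and the current locale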
reader.setMimetype(MimetypeMap.MIMETYPE_BINARY);
reader.setEncoding("UTF-8");
reader.setLocale(I18NUtil.getLocale());
// Done
if (logger.isDebugEnabled())
{
logger.debug(
"Direct request for reader: \n" +
" Content URL: " + contentUrl + "\n" +
" Reader: " + reader);
}
return reader;
}
public ContentReader getReader(NodeRef nodeRef, QName propertyQName)
{
return getReader(nodeRef, propertyQName, true);
}
@SuppressWarnings("unchecked")
private ContentReader getReader(NodeRef nodeRef, QName propertyQName, boolean fireContentReadPolicy)
{
ContentData contentData = null;
Serializable propValue = nodeService.getProperty(nodeRef, propertyQName);
if (propValue instanceof Collection)
{
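// Multi-valued content properties are not fully supported here: just use the first value, if present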
Collection<Serializable> colPropValue = (Collection<Serializable>)propValue;
if (colPropValue.size() > 0)
{
propValue = colPropValue.iterator().next();
}
}
if (propValue instanceof ContentData)
{
contentData = (ContentData)propValue;
}
if (contentData == null)
{
PropertyDefinition contentPropDef = dictionaryService.getProperty(propertyQName);
// if there is no value, or a value other than content, and a property definition has been provided, ensure that its type is CONTENT or ANY
if (contentPropDef != null &&
(!(contentPropDef.getDataType().getName().equals(DataTypeDefinition.CONTENT) ||
contentPropDef.getDataType().getName().equals(DataTypeDefinition.ANY))))
{
throw new InvalidTypeException("The node property must be of type content: \n" +
" node: " + nodeRef + "\n" +
" property name: " + propertyQName + "\n" +
" property type: " + ((contentPropDef == null) ? "unknown" : contentPropDef.getDataType()),
propertyQName);
}
}
// check that the URL is available
if (contentData == null || contentData.getContentUrl() == null)
{
// there is no URL - the interface specifies that this is not an error condition
return null;
}
String contentUrl = contentData.getContentUrl();
// The context of the read is entirely described by the URL
ContentReader reader = store.getReader(contentUrl);
if (reader == null)
{
throw new AlfrescoRuntimeException("ContentStore implementations may not return null ContentReaders");
}
// set extra data on the reader
reader.setMimetype(contentData.getMimetype());
reader.setEncoding(contentData.getEncoding());
reader.setLocale(contentData.getLocale());
// Fire the content read policy
if (fireContentReadPolicy)
{
// Fire the content read policy
Set<QName> types = new HashSet<QName>(this.nodeService.getAspects(nodeRef));
types.add(this.nodeService.getType(nodeRef));
OnContentReadPolicy policy = this.onContentReadDelegate.get(nodeRef, types);
policy.onContentRead(nodeRef);
}
// we don't listen for anything
// result may be null - but interface contract says we may return null
return reader;
}
public ContentWriter getWriter(NodeRef nodeRef, QName propertyQName, boolean update)
{
if (nodeRef == null)
{
ContentContext ctx = new ContentContext(null, null);
// for this case, we just give back a valid URL into the content store
ContentWriter writer = store.getWriter(ctx);
// Register the new URL for rollback cleanup
eagerContentStoreCleaner.registerNewContentUrl(writer.getContentUrl());
// done
return writer;
}
// check for an existing URL - the get of the reader will perform type checking
ContentReader existingContentReader = getReader(nodeRef, propertyQName, false);
// get the content using the (potentially) existing content - the new content
// can be wherever the store decides.
ContentContext ctx = new NodeContentContext(existingContentReader, null, nodeRef, propertyQName);
ContentWriter writer = store.getWriter(ctx);
// Register the new URL for rollback cleanup
eagerContentStoreCleaner.registerNewContentUrl(writer.getContentUrl());
// Special case for AVM repository.
Serializable contentValue = null;
if (nodeRef.getStoreRef().getProtocol().equals(StoreRef.PROTOCOL_AVM))
{
Pair<Integer, String> avmVersionPath = AVMNodeConverter.ToAVMVersionPath(nodeRef);
contentValue = avmService.getContentDataForWrite(avmVersionPath.getSecond());
}
else
{
contentValue = nodeService.getProperty(nodeRef, propertyQName);
}
// set extra data on the reader if the property is pre-existing
if (contentValue != null && contentValue instanceof ContentData)
{
ContentData contentData = (ContentData)contentValue;
writer.setMimetype(contentData.getMimetype());
writer.setEncoding(contentData.getEncoding());
writer.setLocale(contentData.getLocale());
}
// attach a listener if required
if (update)
{
// need a listener to update the node when the stream closes
WriteStreamListener listener = new WriteStreamListener(nodeService, nodeRef, propertyQName, writer);
listener.setRetryingTransactionHelper(transactionHelper);
writer.addListener(listener);
}
// supply the writer with a copy of the mimetype service if needed
if (writer instanceof AbstractContentWriter)
{
((AbstractContentWriter)writer).setMimetypeService(mimetypeService);
}
// give back to the client
return writer;
}
/**
* @return Returns a writer to an anonymous location
*/
public ContentWriter getTempWriter()
{
// there is no existing content and we don't specify the location of the new content
return tempStore.getWriter(ContentContext.NULL_CONTEXT);
}
/**
* @see org.alfresco.repo.content.transform.ContentTransformerRegistry
* @see org.alfresco.repo.content.transform.ContentTransformer
* @see org.alfresco.service.cmr.repository.ContentService#transform(org.alfresco.service.cmr.repository.ContentReader, org.alfresco.service.cmr.repository.ContentWriter)
*/
public void transform(ContentReader reader, ContentWriter writer)
{
// Call transform with no options
TransformationOptions options = new TransformationOptions();
this.transform(reader, writer, options);
}
/**
* @see org.alfresco.repo.content.transform.ContentTransformerRegistry
* @see org.alfresco.repo.content.transform.ContentTransformer
* @deprecated
*/
public void transform(ContentReader reader, ContentWriter writer, Map<String, Object> options)
throws NoTransformerException, ContentIOException
{
transform(reader, writer, new TransformationOptions(options));
}
/**
* @see org.alfresco.repo.content.transform.ContentTransformerRegistry
* @see org.alfresco.repo.content.transform.ContentTransformer
*/
public void transform(ContentReader reader, ContentWriter writer, TransformationOptions options)
throws NoTransformerException, ContentIOException
{
// check that source and target mimetypes are available
if (reader == null)
{
throw new AlfrescoRuntimeException("The content reader must be set");
}
String sourceMimetype = reader.getMimetype();
if (sourceMimetype == null)
{
throw new AlfrescoRuntimeException("The content reader mimetype must be set: " + reader);
}
String targetMimetype = writer.getMimetype();
if (targetMimetype == null)
{
throw new AlfrescoRuntimeException("The content writer mimetype must be set: " + writer);
}
long sourceSize = reader.getSize();
try
{
// look for a transformer
transformerDebug.pushAvailable(reader.getContentUrl(), sourceMimetype, targetMimetype, options);
List<ContentTransformer> transformers = getActiveTransformers(sourceMimetype, sourceSize, targetMimetype, options);
transformerDebug.availableTransformers(transformers, sourceSize, "ContentService.transform(...)");
int count = transformers.size();
if (count == 0)
{
throw new NoTransformerException(sourceMimetype, targetMimetype);
}
if (count == 1 || !transformerFailover)
{
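// Failover is disabled (or there is only one candidate), so use the first (best) transformer directly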
ContentTransformer transformer = transformers.size() == 0 ? null : transformers.get(0);
transformer.transform(reader, writer, options);
}
else
{
failoverTransformers(reader, writer, options, targetMimetype, transformers);
}
}
finally
{
if (transformerDebug.isEnabled())
{
transformerDebug.popAvailable();
debugActiveTransformers(sourceMimetype, targetMimetype, sourceSize, options);
}
}
}
private void failoverTransformers(ContentReader reader, ContentWriter writer,
TransformationOptions options, String targetMimetype,
List<ContentTransformer> transformers)
{
List<AlfrescoRuntimeException> exceptions = null;
boolean done = false;
try
{
// Try the best transformer and then the next if it fails
// and so on down the list
char c = 'a';
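// 'c' labels each failover attempt ('a', 'b', 'c', ...) in the transformer debug output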
String outputFileExt = mimetypeService.getExtension(targetMimetype);
for (ContentTransformer transformer : transformers)
{
ContentWriter currentWriter = writer;
File tempFile = null;
try
{
// We can't know in advance which of the
// available transformers will work - if any.
// We can't write into the ContentWriter stream.
// So make a temporary file writer with the
// current transformer name.
tempFile = TempFileProvider.createTempFile(
"FailoverTransformer_intermediate_"
+ transformer.getClass().getSimpleName() + "_", "."
+ outputFileExt);
currentWriter = new FileContentWriter(tempFile);
currentWriter.setMimetype(targetMimetype);
currentWriter.setEncoding(writer.getEncoding());
if (c != 'a' && transformerDebug.isEnabled())
{
transformerDebug.debug("");
transformerDebug.debug("Try " + c + ")");
}
c++;
transformer.transform(reader, currentWriter, options);
if (tempFile != null)
{
writer.putContent(tempFile);
}
// No need to close input or output streams
// (according
// to comment in FailoverContentTransformer)
done = true;
return;
}
catch (AlfrescoRuntimeException e)
{
if (exceptions == null)
{
exceptions = new ArrayList<AlfrescoRuntimeException>();
}
exceptions.add(e);
// Set a new reader to refresh the input stream.
reader = reader.getReader();
}
}
// Throw the exception from the first transformer. The
// others are consumed.
if (exceptions != null)
{
throw exceptions.get(0);
}
}
finally
{
// Log exceptions that we have consumed. We may have thrown the first one if
// none of the transformers worked.
if (exceptions != null)
{
boolean first = true;
for (Exception e : exceptions)
{
if (done)
{
logger.error("Transformer succeeded after previous transformer failed.", e);
}
else if (!first)
{
logger.error("Transformer exception.", e);
}
first = false;
}
}
}
}
/**
* @see org.alfresco.repo.content.transform.ContentTransformerRegistry
* @see org.alfresco.repo.content.transform.ContentTransformer
*/
public ContentTransformer getTransformer(String sourceMimetype, String targetMimetype)
{
return getTransformer(null, sourceMimetype, -1, targetMimetype, new TransformationOptions());
}
public ContentTransformer getTransformer(String sourceMimetype, String targetMimetype, TransformationOptions options)
{
return getTransformer(null, sourceMimetype, -1, targetMimetype, options);
}
/**
* @see org.alfresco.service.cmr.repository.ContentService#getTransformer(String, java.lang.String, long, java.lang.String, org.alfresco.service.cmr.repository.TransformationOptions)
*/
public ContentTransformer getTransformer(String sourceUrl, String sourceMimetype, long sourceSize, String targetMimetype, TransformationOptions options)
{
List<ContentTransformer> transformers = getTransformers(sourceUrl, sourceMimetype, sourceSize, targetMimetype, options);
return (transformers == null) ? null : transformers.get(0);
}
/**
* @see org.alfresco.service.cmr.repository.ContentService#getTransformers(String, java.lang.String, long, java.lang.String, org.alfresco.service.cmr.repository.TransformationOptions)
*/
public List<ContentTransformer> getTransformers(String sourceUrl, String sourceMimetype, long sourceSize, String targetMimetype, TransformationOptions options)
{
try
{
// look for a transformer
transformerDebug.pushAvailable(sourceUrl, sourceMimetype, targetMimetype, options);
List<ContentTransformer> transformers = getActiveTransformers(sourceMimetype, sourceSize, targetMimetype, options);
transformerDebug.availableTransformers(transformers, sourceSize, "ContentService.getTransformer(...)");
return transformers.isEmpty() ? null : transformers;
}
finally
{
transformerDebug.popAvailable();
}
}
/**
* Checks if the file just uploaded into Share is the special "debugTransformers.txt" file and,
* if it is, creates TransformerDebug output that lists all the supported mimetype transformations
* for each transformer.
*/
private void debugActiveTransformers(String sourceMimetype, String targetMimetype,
long sourceSize, TransformationOptions transformOptions)
{
// check the file name, but do faster tests first
if (sourceSize == 18 &&
MimetypeMap.MIMETYPE_TEXT_PLAIN.equals(sourceMimetype) &&
MimetypeMap.MIMETYPE_IMAGE_PNG.equals(targetMimetype) &&
"debugTransformers.txt".equals(transformerDebug.getFileName(transformOptions, true, 0)))
{
debugActiveTransformers();
}
}
/**
* Creates TransformerDebug output that lists all the supported mimetype transformations for each transformer.
*/
private void debugActiveTransformers()
{
try
{
transformerDebug.pushMisc();
transformerDebug.debug("Active and inactive transformers");
TransformationOptions options = new TransformationOptions();
Map<String, Set<String>> explicitTransforms = debugExplicitTransforms();
for (ContentTransformer transformer: transformerRegistry.getTransformers())
{
try
{
transformerDebug.pushMisc();
int mimetypePairCount = 0;
boolean first = true;
for (String sourceMimetype : mimetypeService.getMimetypes())
{
for (String targetMimetype : mimetypeService.getMimetypes())
{
if (transformer.isTransformable(sourceMimetype, -1, targetMimetype, options))
{
long maxSourceSizeKBytes = transformer.getMaxSourceSizeKBytes(
sourceMimetype, targetMimetype, options);
// Is this an explicit transform (TRUE), a transform ignored because the
// mimetype pair has other explicit transforms (FALSE), or a pair with no
// explicit transforms at all (null)?
Boolean explicit = transformer.isExplicitTransformation(sourceMimetype,
targetMimetype, options);
if (!explicit)
{
Set<String> targetMimetypes = explicitTransforms.get(sourceMimetype);
explicit = (targetMimetypes == null || !targetMimetypes.contains(targetMimetype))
? null
: Boolean.FALSE;
}
transformerDebug.activeTransformer(++mimetypePairCount, transformer,
sourceMimetype, targetMimetype, maxSourceSizeKBytes, explicit, first);
first = false;
}
}
}
if (first)
{
transformerDebug.inactiveTransformer(transformer);
}
}
finally
{
transformerDebug.popMisc();
}
}
}
finally
{
transformerDebug.popMisc();
}
}
/**
* Returns the explicit mimetype transformations. Key is the source mimetype
* and the value is a set of target mimetypes that are explicit.
*/
private Map<String, Set<String>> debugExplicitTransforms()
{
Map<String, Set<String>> explicitTransforms = new HashMap<String, Set<String>>();
TransformationOptions options = new TransformationOptions();
for (String sourceMimetype : mimetypeService.getMimetypes())
{
for (String targetMimetype : mimetypeService.getMimetypes())
{
for (ContentTransformer transformer : transformerRegistry.getTransformers())
{
if (transformer.isTransformable(sourceMimetype, -1, targetMimetype, options))
{
if (transformer.isExplicitTransformation(sourceMimetype, targetMimetype,
options))
{
Set<String> targetMimetypes = explicitTransforms.get(sourceMimetype);
if (targetMimetypes == null)
{
targetMimetypes = new HashSet<String>();
explicitTransforms.put(sourceMimetype, targetMimetypes);
}
targetMimetypes.add(targetMimetype);
break;
}
}
}
}
}
return explicitTransforms;
}
/**
* {@inheritDoc}
*/
public long getMaxSourceSizeBytes(String sourceMimetype, String targetMimetype, TransformationOptions options)
{
try
{
long maxSourceSize = 0;
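// Accumulates the overall limit: 0 => no transformer found (or all disabled), -1 => unlimited, otherwise the largest per-transformer limit in KBytes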
transformerDebug.pushAvailable(null, sourceMimetype, targetMimetype, options);
List<ContentTransformer> transformers = getActiveTransformers(sourceMimetype, -1, targetMimetype, options);
for (ContentTransformer transformer: transformers)
{
long maxSourceSizeKBytes = transformer.getMaxSourceSizeKBytes(sourceMimetype, targetMimetype, options);
if (maxSourceSize >= 0)
{
if (maxSourceSizeKBytes < 0)
{
maxSourceSize = -1;
}
else if (maxSourceSizeKBytes > 0 && maxSourceSize < maxSourceSizeKBytes)
{
maxSourceSize = maxSourceSizeKBytes;
}
}
// if maxSourceSizeKBytes == 0 this implies the transformation is disabled
}
if (transformerDebug.isEnabled())
{
transformerDebug.availableTransformers(transformers, -1,
"ContentService.getMaxSourceSizeBytes() = "+transformerDebug.fileSize(maxSourceSize*1024));
}
return (maxSourceSize > 0) ? maxSourceSize * 1024 : maxSourceSize;
}
finally
{
transformerDebug.popAvailable();
}
}
public List<ContentTransformer> getActiveTransformers(String sourceMimetype, String targetMimetype, TransformationOptions options)
{
return getActiveTransformers(sourceMimetype, -1, targetMimetype, options);
}
public List<ContentTransformer> getActiveTransformers(String sourceMimetype, long sourceSize, String targetMimetype, TransformationOptions options)
{
return transformerRegistry.getActiveTransformers(sourceMimetype, sourceSize, targetMimetype, options);
}
/**
* @see org.alfresco.service.cmr.repository.ContentService#getImageTransformer()
*/
public ContentTransformer getImageTransformer()
{
return imageMagickContentTransformer;
}
/**
* @see org.alfresco.repo.content.transform.ContentTransformerRegistry
* @see org.alfresco.repo.content.transform.ContentTransformer
*/
public boolean isTransformable(ContentReader reader, ContentWriter writer)
{
return isTransformable(reader, writer, new TransformationOptions());
}
/**
* @see org.alfresco.service.cmr.repository.ContentService#isTransformable(org.alfresco.service.cmr.repository.ContentReader, org.alfresco.service.cmr.repository.ContentWriter, org.alfresco.service.cmr.repository.TransformationOptions)
*/
public boolean isTransformable(ContentReader reader, ContentWriter writer, TransformationOptions options)
{
// check that source and target mimetypes are available
String sourceMimetype = reader.getMimetype();
if (sourceMimetype == null)
{
throw new AlfrescoRuntimeException("The content reader mimetype must be set: " + reader);
}
String targetMimetype = writer.getMimetype();
if (targetMimetype == null)
{
throw new AlfrescoRuntimeException("The content writer mimetype must be set: " + writer);
}
// look for a transformer
ContentTransformer transformer = transformerRegistry.getTransformer(sourceMimetype, reader.getSize(), targetMimetype, options);
return (transformer != null);
}
/**
* Ensures that, upon closure of the output stream, the node is updated with
* the latest URL of the content to which it refers.
* <p>
*
* @author Derek Hulley
*/
private static class WriteStreamListener extends AbstractContentStreamListener
{
private NodeService nodeService;
private NodeRef nodeRef;
private QName propertyQName;
private ContentWriter writer;
public WriteStreamListener(
NodeService nodeService,
NodeRef nodeRef,
QName propertyQName,
ContentWriter writer)
{
this.nodeService = nodeService;
this.nodeRef = nodeRef;
this.propertyQName = propertyQName;
this.writer = writer;
}
public void contentStreamClosedImpl() throws ContentIOException
{
try
{
// set the full content property
ContentData contentData = writer.getContentData();
// AVM stores always write to the standard cm:content property.
if (nodeRef.getStoreRef().getProtocol().equals(StoreRef.PROTOCOL_AVM))
{
nodeService.setProperty(nodeRef, ContentModel.PROP_CONTENT, contentData);
}
else
{
nodeService.setProperty(
nodeRef,
propertyQName,
contentData);
}
// done
if (logger.isDebugEnabled())
{
logger.debug("Stream listener updated node: \n" +
" node: " + nodeRef + "\n" +
" property: " + propertyQName + "\n" +
" value: " + contentData);
}
}
catch (ContentQuotaException qe)
{
throw qe;
}
catch (Throwable e)
{
throw new ContentIOException("Failed to set content property on stream closure: \n" +
" node: " + nodeRef + "\n" +
" property: " + propertyQName + "\n" +
" writer: " + writer + "\n" +
e.toString(),
e);
}
}
}
}