alfresco-community-repo/source/java/org/alfresco/repo/transfer/RepoPrimaryManifestProcessorImpl.java
Dave Ward a5f31cd37e Merged V3.3 to HEAD
20167: Merged HEAD to BRANCHES/V3.3: (RECORD ONLY)
      20166: Fix ALF-2765: Renditions created via 3.3 RenditionService are not exposed via OpenCMIS rendition API
   20232: Fix problem opening AVM web project folders via FTP. ALF-2738.
   20234: ALF-2352: Cannot create folders in Share doclib without admin user in authentication chain
   20235: Fix for unable to create folders in web project via CIFS. ALF-2736.
   20258: Reverse-merged rev 20254: 'When dropping the mysql database ...'
   20262: Merged V3.3-BUG-FIX to V3.3
      20251: Fix for ALF-2804 - Unable to browse into folders in Share Site in certain situations.
              - Browser history filter object in incorrect state after page refresh.
   20264: Updated Oracle build support (to fix grants)
   20282: Merged PATCHES/V3.2.0 to V3.3
      20266: Test reproduction of ALF-2839 failure: Node pre-loading generates needless resultset rows
      20280: Fixed ALF-2839: Node pre-loading generates needless resultset rows
   20283: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20194: AVMTestSuite - scale down unit tests (slightly)
      20247: AVMServiceTest.testVersionByDate - build (add delay)
   20290: Fixed ALF-2851 "Drag n Drop issues in IE6 & IE7"
      - Reordering the rules-list with drag and drop didn't work at all because each rule was created from a template that had the "id" attribute set, which confused IE after HTMLElement.clone() even though the id was reset
      - Both customise-dashlets & rules-list got an error when "throwing" away the dashlet or rule instead of releasing it "carefully"; the reason was that IE didn't capture the x:y position, which made the animation fail. Now no animation is done if x:y isn't found.
   20296: Merged PATCHES/V3.1.0 to V3.3 (RECORD ONLY)
      20249: Merged V3.1 to PATCHES/V3.1.0
         14565: Updated version to include revision number (x.y.z)
      20246: Merged V3.1 to PATCHES/V3.1.0
         13841: Build fix
      20245: Merged V3.1 to PATCHES/V3.1.0
         16185: AbstractLuceneIndexerAndSearcherFactory.getTransactionId() must return null when there is no transaction
      20241: Merged V3.1 to PATCHES/V3.1.0
         14187: Fix for ETHREEOH-2023: LDAP import must lower case the local name of the association to person.
         16167: ETHREEOH-2475: Fixed nested transaction handling in AbstractLuceneIndexerAndSearcherFactory to allow duplicate user processing in PersonServiceImpl to actually work
         16168: ETHREEOH-2797: Force patch.db-V2.2-Person to apply one more time to fix up corrupt users created by LDAP Import
            - Problem due to ETHREEOH-2023, fixed in 3.1.1
            - Also corrects ldap.synchronisation.defaultHomeFolderProvider to be userHomesHomeFolderProvider
            - Also requires fix to ETHREEOH-2475 to fix up duplicate users
      20221: Merged PATCHES/V3.1.2 to PATCHES/V3.1.0
         20217: Merged PATCHES/V3.2.0 to PATCHES/V3.1.2
            19793: Merged HEAD to V3.2.0
               19786: Refactor of previous test fix. I have pushed down the OOo-specific parts of the change from AbstractContentTransformerTest to OpenOfficeContentTransformerTest leaving an extension point in the base class should other transformations need to be excluded in the future.
               19785: Fix for failing test OpenOfficeContentTransformerTest.testAllConversions.
                  Various OOo-related transformations are returned as available but fail on our test server with OOo on it.
                  Pending further work on these failings, I am disabling those transformations in test code whilst leaving them available in the product code. This is because in the wild a different OOo version may succeed with these transformations.
                  I had previously explicitly disabled 3 transformations in the product and I am moving that restriction from product to test code for the same reason.
               19707: Return value from isTransformationBlocked was inverted. Fixed now.
               19705: Refinement of previous check-in re OOo transformations.
                  I have pulled up the code that handles blocked transformations into a superclass so that the JodConverter-based transformer worker can inherit the same list of blocked transformations. To reiterate, blocked transformations are those that the OOo integration code believes should work but which are broken in practice. These are blocked by the transformers and will always be unavailable regardless of the OOo connection state.
               19702: Fix for HEAD builds running on panda build server.
                  OOo was recently installed on panda which has activated various OOo-related transformations/extractions in the test code.
                  It appears that OOo does not support some transformations from Office 97 to Office 2007. Specifically doc to docx and xls to xlsx. These transformations have now been marked as unavailable.
      20220: Created hotfix branch off TAGS/ENTERPRISE/V3.1.0
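
The "blocked transformation" mechanism described in 19705/19702 above can be sketched as follows. This is an illustrative assumption, not the actual Alfresco API: class name, key format, and the mimetype pairs are taken only from the descriptions in this log (doc->docx and xls->xlsx marked unavailable; `isTransformationBlocked` named in 19707).

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch (hypothetical names): transformations that the OOo
// integration reports as available but that fail in practice are listed
// explicitly and always reported as blocked, regardless of connection state.
public class BlockedTransformationsSketch
{
    private static final Set<String> BLOCKED = new HashSet<String>();
    static
    {
        // Office 97 -> Office 2007 conversions observed to fail via OOo
        BLOCKED.add(key("application/msword",
                "application/vnd.openxmlformats-officedocument.wordprocessingml.document"));
        BLOCKED.add(key("application/vnd.ms-excel",
                "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"));
    }

    private static String key(String sourceMimetype, String targetMimetype)
    {
        return sourceMimetype + "|" + targetMimetype;
    }

    // Note: 19707 above records that an earlier version of this check had its
    // return value inverted, so blocked transformations were reported as OK.
    public static boolean isTransformationBlocked(String sourceMimetype, String targetMimetype)
    {
        return BLOCKED.contains(key(sourceMimetype, targetMimetype));
    }
}
```
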
   20297: Merged PATCHES/V3.1.2 to V3.3 (RECORD ONLY)
      20268: Increment version number
      20267: ALF-550: Merged V3.2 to PATCHES/V3.1.2
         17768: Merged DEV/BELARUS/V3.2-2009_11_24 to V3.2
            17758: ETHREEOH-3757: Oracle upgrade issue: failed "inviteEmailTemplate" patch - also causes subsequent patches to not be applied
      20217: Merged PATCHES/V3.2.0 to PATCHES/V3.1.2
         19793: Merged HEAD to V3.2.0
            19786: Refactor of previous test fix. I have pushed down the OOo-specific parts of the change from AbstractContentTransformerTest to OpenOfficeContentTransformerTest leaving an extension point in the base class should other transformations need to be excluded in the future.
            19785: Fix for failing test OpenOfficeContentTransformerTest.testAllConversions.
               Various OOo-related transformations are returned as available but fail on our test server with OOo on it.
               Pending further work on these failings, I am disabling those transformations in test code whilst leaving them available in the product code. This is because in the wild a different OOo version may succeed with these transformations.
               I had previously explicitly disabled 3 transformations in the product and I am moving that restriction from product to test code for the same reason.
            19707: Return value from isTransformationBlocked was inverted. Fixed now.
            19705: Refinement of previous check-in re OOo transformations.
               I have pulled up the code that handles blocked transformations into a superclass so that the JodConverter-based transformer worker can inherit the same list of blocked transformations. To reiterate, blocked transformations are those that the OOo integration code believes should work but which are broken in practice. These are blocked by the transformers and will always be unavailable regardless of the OOo connection state.
            19702: Fix for HEAD builds running on panda build server.
               OOo was recently installed on panda which has activated various OOo-related transformations/extractions in the test code.
               It appears that OOo does not support some transformations from Office 97 to Office 2007. Specifically doc to docx and xls to xlsx. These transformations have now been marked as unavailable.
      20204: Moved version label to '.6'
   20298: Merged PATCHES/V3.2.0 to V3.3 (RECORD ONLY)
      20281: Incremented version number to '10'
      20272: Backports to help fix ALF-2839: Node pre-loading generates needless resultset rows
         Merged BRANCHES/V3.2 to PATCHES/V3.2.0:
            18490: Added cache for alf_content_data
         Merged BRANCHES/DEV/V3.3-BUG-FIX to PATCHES/V3.2.0:
            20231: Fixed ALF-2784: Degradation of performance between 3.1.1 and 3.2x (observed in JSF)
   20299: Merged PATCHES/V3.2.1 to V3.3 (RECORD ONLY)
      20279: Incremented version label
      20211: Reinstated patch 'patch.convertContentUrls' (reversed rev 20205 ALF-2719)
      20210: Incremented version label to '.3'
      20206: Bumped version label to '.2'
      20205: Workaround for ALF-2719 by disabling patch.convertContentUrls and ContentStoreCleaner
      20149: Incremented version label
      20101: Created hotfix branch off ENTERPRISE/V3.2.1
   20300: Merged BRANCHES/DEV/BELARUS/HEAD-2010_04_28 to BRANCHES/V3.3:
      20293: ALF-767: remove-AVM-issuer.sql upgrade does not account for column (mis-)order - fixed for MySQL, PostgreSQL and Oracle (DB2 & MS SQL Server already OK)
   20301: Merged PATCHES/V3.2.1 to V3.3
      20278: ALF-206: Make it possible to follow hyperlinks to document JSF client URLs from MS Office
         - A request parameter rather than a (potentially forgotten) session attribute is used to propagate the URL to redirect to after successful login
   20303: Fixed ALF-2855: FixAuthorityCrcValuesPatch reports NPE during upgrade from 2.1.7 to 3.3E
      - Auto-unbox NPE on Long->long: Just used the Long directly for reporting
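
The bug class fixed in 20303 is reproducible in a few lines. The helper below is hypothetical (not the patch code); it only demonstrates that auto-unboxing a null `Long` into a primitive `long` throws a NullPointerException, and that using the `Long` reference directly avoids it:

```java
// Illustrative sketch of the ALF-2855 bug class: auto-unboxing a null Long
// into a primitive long throws a NullPointerException.
public class UnboxingNpeSketch
{
    // Hypothetical reporting helper. The fix pattern is to use the Long
    // reference directly rather than assigning it to a primitive long first.
    public static String report(Long count)
    {
        // BUG variant: "long c = count;" throws NPE when count is null.
        // FIX variant: string concatenation handles a null Long safely.
        return "count=" + count;
    }
}
```
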
   20319: Fixed ALF-2854: User Usage Queries use read-write methods on QNameDAO
   20322: Fixed ALF-1998: contentStoreCleanerJob leads to foreign key exception
      - Possible concurrent modification of alf_content_url.orphan_time led to false orphan detection
      - Fixed queries to check for dereferencing AND use the indexed orphan_time column
      - More robust use of EagerContentStoreCleaner: On eager cleanup, ensure that URLs are deleted
      - Added optimistic lock checks on updates and deletes of alf_content_url
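
The optimistic-lock idea added in 20322 can be illustrated with an in-memory sketch. The real change guards SQL updates/deletes on alf_content_url; everything here (class, method names, the id/orphan_time map) is an assumption for illustration only:

```java
import java.util.HashMap;
import java.util.Map;

// In-memory illustration (hypothetical names) of an optimistic-lock check:
// an update succeeds only if the row still holds the orphan_time value the
// caller last read; a concurrent change makes the update fail, so the caller
// can re-read instead of clobbering newer data.
public class OptimisticLockSketch
{
    private final Map<Long, Long> orphanTimeById = new HashMap<Long, Long>();

    public void put(long id, Long orphanTime)
    {
        orphanTimeById.put(id, orphanTime);
    }

    public Long read(long id)
    {
        return orphanTimeById.get(id);
    }

    // Analogous to: UPDATE alf_content_url SET orphan_time = ?
    //               WHERE id = ? AND orphan_time = ?   (then check rows affected)
    public boolean updateIfUnchanged(long id, Long expectedOrphanTime, Long newOrphanTime)
    {
        Long current = orphanTimeById.get(id);
        boolean unchanged = (current == null) ? (expectedOrphanTime == null)
                : current.equals(expectedOrphanTime);
        if (unchanged)
        {
            orphanTimeById.put(id, newOrphanTime);
        }
        return unchanged;
    }
}
```
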
   20335: Merged DEV/V3.3-BUG-FIX to V3.3
      20334: ALF-2473: Changes for clean startup and shutdown of subsystems on Spring 3
         - Removed previous SafeEventPublisher workaround for startup errors and associated changes
         - Replaced with SafeApplicationEventMulticaster which queues up events while an application context isn't started
         - Now all subsystems shut down cleanly
         - Fixes problem with FileContentStore visibility in JMX too!
   20341: ALF-2517 Quick fix: rules that compare the creation/modification date of content are now correctly applied when content is uploaded to a folder.
   20346: ALF-2839: Node pre-loading generates needless resultset rows
      - Added missing Criteria.list() call
   20347: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20231: Fixed ALF-2784: Degradation of performance between 3.1.1 and 3.2x (observed in JSF)
   20356: Merged DEV/BELARUS/HEAD-2010_03_30 to V3.3 (with corrections)
      19735: ALF-686: Alfresco cannot start if read/write mode in Sysadmin subsystem is configured
         1. org.alfresco.repo.module.ModuleComponentHelper was modified to allow the “System” user to run write operations in a read-only system.
         2. Startup of the “Synchronization” subsystem failed with the same error as occurred during module startup. org.alfresco.repo.security.sync.ChainingUserRegistrySynchronizer was also modified to allow the “System” user to run write operations in read-only mode.
   20361: Merged HEAD to BRANCHES/V3.3: (RECORD ONLY)
      20345: Fix ALF-2319: CMIS 'current' version mapping is not compliant with spec
      20354: Update test to reflect changes to CMIS version mapping.
   20363: Merge from V3.2 to V3.2 (all record-only)
      c. 19448 OOoJodConverter worker bean correctly handles isAvailable() when subsystem is disabled.
      c. 19484 JodConverter-backed thumbnailing test now explicitly sets OOoDirect and OOoJodconverter enabled-ness back to default settings in tearDown
      c. 20175 Fix for ALF-2773 JMX configuration of enterprise logging broken
   20376: Altered URL of online help to point at http://www.alfresco.com/help/33/enterprise/webeditor/
   20395: set google docs off
   20398: Fixed ALF-2890: Upgrade removes content if transaction retries are triggered
      - Setting ContentData that was derived outside of the current transaction opened up a window
        for the post-rollback code to delete the underlying binary. The binaries are only registered
        for writers fetched via the ContentService now; the low-level DAO no longer does management
        because it can't assume that a new content URL indicates a new underlying binary.
      - The contentUrlConverter was creating new URLs and thus the low-level DAO cleaned up
        live content when retrying collisions took place. The cleanup is no longer on the stack
        for the patch.
      - Removes the ALF-558 changes around ContentData.reference()
   20399: Remove googledocs aspect option
   20400: PurgeTestP (AVM) - increase wait cycles
   20422: Added ooo converter properties
   20425: Merge V3.3-BUG-FIX to V3.3
      20392 : ALF-2716 - IMAP mail metadata extraction fails when the Alfresco server locale is non-English
      20365 : Merge DEV to V3.3-BUG_FIX     
         18011 : ETHREEOH-3804 - IMAP message body doesn't appear in IMAP folder when message subject is equal to the attachment name
      20332 : Build fix - rework to the ImapServiceUnit tests.
      20325 : build fix
      20318 : MERGE DEV TO V3.3-BUG-FIX    
         20287 : ALF-2754: Alfresco IMAP and Zimbra Desktop Client.
      20317 : ALF-2716 - IMAP mail metadata extraction fails when the Alfresco server locale is non-English. This change reworks the received-date metadata extraction.
      20316 : ALF-1912 - Problem with IMAP Sites visibility. Now only IMAP favourites are shown. Also major rework to the way that this service uses the FileFolderService.
      20315 : ALF-1912 Updates to the FileFolderService to support the IMAP Service
         - add listDeepFolders
         - remove "makeFolders", which moves to its own utility class
         - update JavaDoc
   20429: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20171: 3.3SP1 bug fix branch
      20174: Fix for ALF-960 and ALFCOM-1980: WCM - File Picker Restriction relative to folder not web project
      20179: ALF-2629 Now when a workflow timer signals a transition it also ends the associated task.
   20433: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20184: ALF-2772: Added new test case to RepoTransferReceiverImplTest and fixed the fault in the primary manifest processor.
      20196: Temporary fix to SandboxServiceImplTest, which reverses the fix to ALF-2529.
   20434: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3: (RECORD ONLY)
      20213: (RECORD ONLY) Merge from V3.3 to V3.3-BUG-FIX
         r20176 Merge from V3.2 to V3.3.
             r20175. JMX configuration of enterprise logging broken (fix).
      20215: (RECORD ONLY) Merge from V3.3 to V3.3-BUG-FIX
         r20178 JodConverter loggers are now exposed in JMX.
      20218: (RECORD ONLY) Merged BRANCHES/V3.3 to BRANCHES/DEV/V3.3-BUG-FIX:
         20195: Form fields for numbers are now rendered much smaller than ...
      20248: (RECORD ONLY) Merging HEAD into V3.3
      20284: (RECORD ONLY) Merged BRANCHES/V3.3 to BRANCHES/DEV/V3.3-BUG-FIX:
         20177: Add 'MaxPermSize' setting for DOD JUnit tests
      20305: (RECORD ONLY) Merged BRANCHES/V3.3 to BRANCHES/DEV/V3.3-BUG-FIX:
         20236: Add Oracle support for creating/dropping "databases" (users) in continuous.xml
         20264: Updated Oracle build support (to fix grants)
   20435: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20233: Part fix for ALF-2811: DOD5015 module breaks CMIS tck
      20239: Final part of fix for ALF-2811: DOD5015 module breaks CMIS tck
      20250: Merge from DEV/BELARUS/HEAD-2010_04_28 to V3.3-BUG-FIX
         20230 ALF-2450: latin/utf-8 HTML file cannot be text-extracted.
      20253: ALF-2629 Now tasks should correctly be ended when an associated timer is triggered. Should no longer cause WCM workflows to fail.
      20254: ALF-2579 Changed the status code on incorrect password to '401' to reflect that it is an authorisation error.
      20263: Fix for ALF-2500: query with a ! in contains search make it strange
      20265: Fix for ALF-1495. Reindexing of OOo-transformed content after OOo crash.
   20436: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20292: (RECORD ONLY) Latest SpringSurf libs:
      20308: (RECORD ONLY) Latest SpringSurf libs:
      20366: (RECORD ONLY) Latest SpringSurf libs:
      20415: Latest SpringSurf libs:
   20437: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20270: Build times: SearchTestSuite
      20273: Fix for ALF-2125 - Accessing a deleted page in Share does not return an error page, instead the document-details page breaks
      20274: Fix for ALF-2518: It's impossible to find user by user name in Add User or Group window at Manage permissions page (also allows users to be found by username in the Share Admin Console).
      20277: Fix for ALF-2417: Create Web Content Wizard if cancelling/aborting Step Two - Author Web Content, any asset being uploaded gets locked
      20291: Reduce build time: Added security test suite to cover 17 security tests 
   20439: Merged BRANCHES/DEV/V3.3-BUG-FIX to BRANCHES/V3.3:
      20302: Fixed ALF-727:  Oracle iBatis fails on PropertyValueDAOTest Double.MAX_VALUE
      20307: VersionStore - minor fixes if running deprecated V1 
      20310: Fixed a bug in UIContentSelector which was building lucene search queries incorrectly.
      20314: Fix for ALF-2789 - DispatcherServlet not correctly retrieving Object ID from request parameters
      20320: Merged DEV/TEMPORARY to V3.3-BUG-FIX
         20313: ALF-2507: Not able to email space users even if the user owns the space 
      20324: Fixed ALF-2078 "Content doesn't make checked in after applying 'Check-in' rule in Share"
      20327: Fix Quickr project to compile in Eclipse
      20367: ALF-2829: Avoid reading entire result set into memory in FixNameCrcValuesPatch
      20368: Work-around for ALF-2366: patch.updateDmPermissions takes too long to complete
      20369: Part 1 of fix for ALF-2943: Update incorrect mimetypes (Excel and Powerpoint)
      20370: Version Migrator (ALF-1000) - use common batch processor to enable multiple workers
      20373: Version Migrator (ALF-1000) - resolve runtime conflict (w/ r20334)
      20378: Merged BRANCHES/DEV/BELARUS/HEAD-2010_04_28 to BRANCHES/DEV/V3.3-BUG-FIX:
         20312: ALF-2162: Error processing WCM form: XFormsBindingException: property 'constraint' already present at model item
      20381: Fixed ALF-2943: Update incorrect mimetypes (Excel and Powerpoint)


git-svn-id: https://svn.alfresco.com/repos/alfresco-enterprise/alfresco/HEAD/root@20571 c4b6b30b-aa2e-2d43-bbcb-ca4b014f7261
2010-06-09 14:01:07 +00:00


/*
 * Copyright (C) 2009-2010 Alfresco Software Limited.
 *
 * This file is part of Alfresco
 *
 * Alfresco is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Lesser General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * Alfresco is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public License
 * along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
 */
package org.alfresco.repo.transfer;

import java.io.File;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import org.alfresco.model.ContentModel;
import org.alfresco.repo.transfer.CorrespondingNodeResolver.ResolvedParentChildPair;
import org.alfresco.repo.transfer.manifest.TransferManifestDeletedNode;
import org.alfresco.repo.transfer.manifest.TransferManifestHeader;
import org.alfresco.repo.transfer.manifest.TransferManifestNode;
import org.alfresco.repo.transfer.manifest.TransferManifestNormalNode;
import org.alfresco.service.cmr.repository.ChildAssociationRef;
import org.alfresco.service.cmr.repository.ContentData;
import org.alfresco.service.cmr.repository.ContentService;
import org.alfresco.service.cmr.repository.ContentWriter;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeService;
import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.cmr.transfer.TransferReceiver;
import org.alfresco.service.namespace.QName;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
/**
 * @author brian
 *
 */
public class RepoPrimaryManifestProcessorImpl extends AbstractManifestProcessorBase
{
    private static final Log log = LogFactory.getLog(RepoPrimaryManifestProcessorImpl.class);

    private static final String MSG_NO_PRIMARY_PARENT_SUPPLIED = "transfer_service.receiver.no_primary_parent_supplied";
    private static final String MSG_ORPHANS_EXIST = "transfer_service.receiver.orphans_exist";
    private static final String MSG_REFERENCED_CONTENT_FILE_MISSING = "transfer_service.receiver.content_file_missing";

    protected static final Set<QName> DEFAULT_LOCAL_PROPERTIES = new HashSet<QName>();

    static
    {
        DEFAULT_LOCAL_PROPERTIES.add(ContentModel.PROP_STORE_IDENTIFIER);
        DEFAULT_LOCAL_PROPERTIES.add(ContentModel.PROP_STORE_NAME);
        DEFAULT_LOCAL_PROPERTIES.add(ContentModel.PROP_STORE_PROTOCOL);
        DEFAULT_LOCAL_PROPERTIES.add(ContentModel.PROP_NODE_DBID);
        DEFAULT_LOCAL_PROPERTIES.add(ContentModel.PROP_NODE_REF);
        DEFAULT_LOCAL_PROPERTIES.add(ContentModel.PROP_NODE_UUID);
    }

    private NodeService nodeService;
    private ContentService contentService;
    private CorrespondingNodeResolver nodeResolver;

    private Map<NodeRef, List<ChildAssociationRef>> orphans = new HashMap<NodeRef, List<ChildAssociationRef>>(89);

    /**
     * @param receiver
     * @param transferId
     */
    public RepoPrimaryManifestProcessorImpl(TransferReceiver receiver, String transferId)
    {
        super(receiver, transferId);
    }
    /*
     * (non-Javadoc)
     *
     * @see org.alfresco.repo.transfer.manifest.TransferManifestProcessor#endTransferManifest()
     */
    protected void endManifest()
    {
        if (!orphans.isEmpty())
        {
            error(MSG_ORPHANS_EXIST);
        }
    }
    /**
     *
     */
    protected void processNode(TransferManifestDeletedNode node)
    {
        // This is a deleted node. First we need to check whether it has already been deleted in this repo
        // too by looking in the local archive store. If we find it then we need not do anything.
        // If we can't find it in our archive store then we'll see if we can find a corresponding node in the
        // store in which its old parent lives.
        // If we can find a corresponding node then we'll delete it.
        // If we can't find a corresponding node then we'll do nothing.
        logProgress("Processing incoming deleted node: " + node.getNodeRef());
        if (!nodeService.exists(node.getNodeRef()))
        {
            // It's not in our archive store. Check to see if we can find it in
            // its original store...
            ChildAssociationRef origPrimaryParent = node.getPrimaryParentAssoc();
            NodeRef origNodeRef = new NodeRef(origPrimaryParent.getParentRef().getStoreRef(), node.getNodeRef().getId());

            CorrespondingNodeResolver.ResolvedParentChildPair resolvedNodes = nodeResolver.resolveCorrespondingNode(
                    origNodeRef, origPrimaryParent, node.getParentPath());

            // Does a corresponding node exist in this repo?
            if (resolvedNodes.resolvedChild != null)
            {
                // Yes, it does. Delete it.
                if (log.isDebugEnabled())
                {
                    log.debug("Incoming deleted noderef " + node.getNodeRef()
                            + " has been resolved to existing local noderef " + resolvedNodes.resolvedChild
                            + " - deleting");
                }
                logProgress("Deleting local node: " + resolvedNodes.resolvedChild);
                nodeService.deleteNode(resolvedNodes.resolvedChild);
                if (log.isDebugEnabled())
                {
                    log.debug("Deleted local node: " + resolvedNodes.resolvedChild);
                }
            }
            else
            {
                logProgress("Unable to find corresponding node for incoming deleted node: " + node.getNodeRef());
                if (log.isDebugEnabled())
                {
                    log.debug("Incoming deleted noderef has no corresponding local noderef: " + node.getNodeRef()
                            + " - ignoring");
                }
            }
        }
        else
        {
            logProgress("Incoming deleted node is already in the local archive store - ignoring: " + node.getNodeRef());
        }
    }
    /*
     * (non-Javadoc)
     *
     * @see org.alfresco.repo.transfer.manifest.TransferManifestProcessor#
     * processTransferManifestNode(org.alfresco.repo.transfer.manifest.TransferManifestNode)
     */
    protected void processNode(TransferManifestNormalNode node)
    {
        if (log.isDebugEnabled())
        {
            log.debug("Processing node with incoming noderef of " + node.getNodeRef());
        }
        logProgress("Processing incoming node: " + node.getNodeRef() + " -- Source path = " + node.getParentPath()
                + "/" + node.getPrimaryParentAssoc().getQName());

        ChildAssociationRef primaryParentAssoc = node.getPrimaryParentAssoc();
        if (primaryParentAssoc == null)
        {
            error(node, MSG_NO_PRIMARY_PARENT_SUPPLIED);
        }

        CorrespondingNodeResolver.ResolvedParentChildPair resolvedNodes = nodeResolver.resolveCorrespondingNode(
                node.getNodeRef(), primaryParentAssoc, node.getParentPath());

        // Does a corresponding node exist in this repo?
        if (resolvedNodes.resolvedChild != null)
        {
            // Yes, it does. Update it.
            if (log.isDebugEnabled())
            {
                log.debug("Incoming noderef " + node.getNodeRef() + " has been resolved to existing local noderef "
                        + resolvedNodes.resolvedChild);
            }
            update(node, resolvedNodes, primaryParentAssoc);
        }
        else
        {
            // No, there is no corresponding node. Worth just quickly checking
            // the archive store...
            NodeRef archiveNodeRef = new NodeRef(StoreRef.STORE_REF_ARCHIVE_SPACESSTORE, node.getNodeRef().getId());
            if (nodeService.exists(archiveNodeRef))
            {
                // We have found a node in the archive store that has the same
                // UUID as the one that we've been sent. We'll restore that
                // archived node to a temporary location and then try
                // processing this node again.
                if (log.isInfoEnabled())
                {
                    log.info("Located an archived node with UUID matching transferred node: " + archiveNodeRef);
                    log.info("Attempting to restore " + archiveNodeRef);
                }
                logProgress("Restoring node from archive: " + archiveNodeRef);

                ChildAssociationRef tempLocation = getTemporaryLocation(node.getNodeRef());
                NodeRef restoredNodeRef = nodeService.restoreNode(archiveNodeRef, tempLocation.getParentRef(),
                        tempLocation.getTypeQName(), tempLocation.getQName());
                if (log.isInfoEnabled())
                {
                    log.info("Successfully restored node as " + restoredNodeRef + " - retrying transferred node");
                }

                // Check to see if the node we've just restored is the parent of any orphans
                checkOrphans(restoredNodeRef);

                // Process the received node information again now that we've restored it
                // (should be handled as an update now)
                processTransferManifestNode(node);
                return;
            }
            if (log.isDebugEnabled())
            {
                log.debug("Incoming noderef has no corresponding local noderef: " + node.getNodeRef());
            }
            create(node, resolvedNodes, primaryParentAssoc);
        }
    }
    /**
     *
     * @param node
     * @param resolvedNodes
     * @param primaryParentAssoc
     */
    private void create(TransferManifestNormalNode node, ResolvedParentChildPair resolvedNodes,
            ChildAssociationRef primaryParentAssoc)
    {
        log.info("Creating new node with noderef " + node.getNodeRef());
        logProgress("Creating new node to correspond to incoming node: " + node.getNodeRef());

        QName parentAssocType = primaryParentAssoc.getTypeQName();
        QName parentAssocName = primaryParentAssoc.getQName();
        NodeRef parentNodeRef = resolvedNodes.resolvedParent;
        if (parentNodeRef == null)
        {
            if (log.isDebugEnabled())
            {
                log.debug("Unable to resolve parent for inbound noderef " + node.getNodeRef()
                        + ".\n Supplied parent noderef is " + primaryParentAssoc.getParentRef()
                        + ".\n Supplied parent path is " + node.getParentPath().toString());
            }
            // We can't find the node's parent.
            // We'll store the node in a temporary location and record it for
            // later processing
            ChildAssociationRef tempLocation = getTemporaryLocation(node.getNodeRef());
            parentNodeRef = tempLocation.getParentRef();
            parentAssocType = tempLocation.getTypeQName();
            parentAssocName = tempLocation.getQName();
            log.info("Recording orphaned transfer node: " + node.getNodeRef());
            logProgress("Unable to resolve parent for new incoming node. Storing it in temp folder: " + node.getNodeRef());
            storeOrphanNode(primaryParentAssoc);
        }
        // We now know that this is a new node, and we have found the
        // appropriate parent node in the local repository.
        log.info("Resolved parent node to " + parentNodeRef);

        // We need to process content properties separately.
        // First, create a shallow copy of the supplied property map...
        Map<QName, Serializable> props = new HashMap<QName, Serializable>(node.getProperties());

        // Split out the content properties and sanitise the others
        Map<QName, Serializable> contentProps = processProperties(null, props, true);

        // Create the corresponding node...
        ChildAssociationRef newNode = nodeService.createNode(parentNodeRef, parentAssocType, parentAssocName,
                node.getType(), props);
        if (log.isDebugEnabled())
        {
            log.debug("Created new node (" + newNode.getChildRef() + ") parented by node " + newNode.getParentRef());
        }

        // Deal with the content properties
        writeContent(newNode.getChildRef(), contentProps);

        // Apply any aspects that are needed but haven't automatically been applied
        Set<QName> aspects = new HashSet<QName>(node.getAspects());
        aspects.removeAll(nodeService.getAspects(newNode.getChildRef()));
        for (QName aspect : aspects)
        {
            nodeService.addAspect(newNode.getChildRef(), aspect, null);
        }

        // Is the node that we've just added the parent of any orphans that
        // we've found earlier?
        checkOrphans(newNode.getChildRef());
    }
    private void checkOrphans(NodeRef parentNode)
    {
        List<ChildAssociationRef> orphansToClaim = orphans.get(parentNode);
        if (orphansToClaim != null)
        {
            // Yes, it is...
            for (ChildAssociationRef orphan : orphansToClaim)
            {
                logProgress("Re-parenting previously orphaned node (" + orphan.getChildRef() + ") with found parent "
                        + orphan.getParentRef());
                nodeService.moveNode(orphan.getChildRef(), orphan.getParentRef(), orphan.getTypeQName(),
                        orphan.getQName());
            }
            // We can now remove the record of these orphans, as their parent
            // has been found
            orphans.remove(parentNode);
        }
    }
    /**
     *
     * @param node
     * @param resolvedNodes
     * @param primaryParentAssoc
     */
    private void update(TransferManifestNormalNode node, ResolvedParentChildPair resolvedNodes,
            ChildAssociationRef primaryParentAssoc)
    {
        NodeRef nodeToUpdate = resolvedNodes.resolvedChild;
        logProgress("Updating local node: " + node.getNodeRef());

        QName parentAssocType = primaryParentAssoc.getTypeQName();
        QName parentAssocName = primaryParentAssoc.getQName();
        NodeRef parentNodeRef = resolvedNodes.resolvedParent;
        if (parentNodeRef == null)
        {
            // We can't find the node's parent.
            // We'll store the node in a temporary location and record it for
            // later processing
            ChildAssociationRef tempLocation = getTemporaryLocation(node.getNodeRef());
            parentNodeRef = tempLocation.getParentRef();
            parentAssocType = tempLocation.getTypeQName();
            parentAssocName = tempLocation.getQName();
            storeOrphanNode(primaryParentAssoc);
        }
        // First of all, do we need to move the node? If any aspect of the
        // primary parent association has changed then the answer is "yes".
        ChildAssociationRef currentParent = nodeService.getPrimaryParent(nodeToUpdate);
        if (!currentParent.getParentRef().equals(parentNodeRef)
                || !currentParent.getTypeQName().equals(parentAssocType)
                || !currentParent.getQName().equals(parentAssocName))
        {
            // Yes, we need to move the node
            nodeService.moveNode(nodeToUpdate, parentNodeRef, parentAssocType, parentAssocName);
            logProgress("Moved node " + nodeToUpdate + " to be under parent node " + parentNodeRef);
        }

        log.info("Resolved parent node to " + parentNodeRef);

        if (updateNeeded(node, nodeToUpdate))
        {
            // We need to process content properties separately.
            // First, create a shallow copy of the supplied property map...
            Map<QName, Serializable> props = new HashMap<QName, Serializable>(node.getProperties());

            // Split out the content properties and sanitise the others
            Map<QName, Serializable> contentProps = processProperties(nodeToUpdate, props, false);

            // Update the non-content properties
            nodeService.setProperties(nodeToUpdate, props);

            // Deal with the content properties
            writeContent(nodeToUpdate, contentProps);

            // Blend the aspects together
            Set<QName> suppliedAspects = new HashSet<QName>(node.getAspects());
            Set<QName> existingAspects = nodeService.getAspects(nodeToUpdate);
            Set<QName> aspectsToRemove = new HashSet<QName>(existingAspects);
            aspectsToRemove.removeAll(suppliedAspects);
            suppliedAspects.removeAll(existingAspects);

            // Now aspectsToRemove contains the set of aspects to remove
            // and suppliedAspects contains the set of aspects to add
            for (QName aspect : suppliedAspects)
            {
                nodeService.addAspect(nodeToUpdate, aspect, null);
            }
            for (QName aspect : aspectsToRemove)
            {
                nodeService.removeAspect(nodeToUpdate, aspect);
            }
        }
    }
/**
 * This method takes all the received properties and separates them into two parts. The content properties are
 * removed from the non-content properties such that the non-content properties remain in the "props" map and the
 * content properties are returned from this method. Subsequently, any properties that are to be retained from the
 * local repository are copied over into the "props" map. The result of all this is that, upon return, "props"
 * contains all the non-content properties that are to be written to the local repo, and the returned map contains
 * all the content properties that are to be written to the local repo.
 *
 * @param nodeToUpdate
 * The noderef of the existing node in the local repo that is to be updated with these properties. May be
 * null, indicating that these properties are destined for a brand new local node.
 * @param props
 * The full set of received properties; on return it holds only the non-content properties
 * @param isNew
 * true if the properties are destined for a newly created node, false if an existing node is being updated
 * @return A map containing the content properties from the supplied "props" map
 */
private Map<QName, Serializable> processProperties(NodeRef nodeToUpdate, Map<QName, Serializable> props,
boolean isNew)
{
Map<QName, Serializable> contentProps = new HashMap<QName, Serializable>();
// Copy any supplied content properties into this new map...
for (Map.Entry<QName, Serializable> propEntry : props.entrySet())
{
Serializable value = propEntry.getValue();
if (log.isDebugEnabled())
{
if (value == null)
{
log.debug("Received a null value for property " + propEntry.getKey());
}
}
if (value instanceof ContentData)
{
contentProps.put(propEntry.getKey(), propEntry.getValue());
}
}
// Now we can remove the content properties from amongst the other kinds
// of properties
// (no removeAll on a Map...)
for (QName contentPropertyName : contentProps.keySet())
{
props.remove(contentPropertyName);
}
if (!isNew)
{
// Finally, overlay the repo-specific properties from the existing
// node (if there is one)
Map<QName, Serializable> existingProps = (nodeToUpdate == null) ? new HashMap<QName, Serializable>()
: nodeService.getProperties(nodeToUpdate);
for (QName localProperty : getLocalProperties())
{
Serializable existingValue = existingProps.get(localProperty);
if (existingValue != null)
{
props.put(localProperty, existingValue);
}
else
{
props.remove(localProperty);
}
}
}
return contentProps;
}
/**
 * Writes the staged content files referenced by the supplied content properties onto the target node.
 *
 * @param nodeToUpdate the local node to which the content is to be written
 * @param contentProps the content properties to write; each value is expected to be a ContentData
 */
private void writeContent(NodeRef nodeToUpdate, Map<QName, Serializable> contentProps)
{
File stagingDir = getStagingFolder();
for (Map.Entry<QName, Serializable> contentEntry : contentProps.entrySet())
{
ContentData contentData = (ContentData) contentEntry.getValue();
String contentUrl = contentData.getContentUrl();
String fileName = contentUrl.substring(contentUrl.lastIndexOf('/') + 1);
File stagedFile = new File(stagingDir, fileName);
if (!stagedFile.exists())
{
error(MSG_REFERENCED_CONTENT_FILE_MISSING);
}
ContentWriter writer = contentService.getWriter(nodeToUpdate, contentEntry.getKey(), true);
writer.setEncoding(contentData.getEncoding());
writer.setMimetype(contentData.getMimetype());
writer.setLocale(contentData.getLocale());
writer.putContent(stagedFile);
}
}
protected boolean updateNeeded(TransferManifestNormalNode node, NodeRef nodeToUpdate)
{
boolean updateNeeded = true;
// Note: the heuristic below is currently disabled, so an update is always assumed to be needed.
// Assumption: if the modified and modifier properties haven't changed, and the cm:content property
// (if it exists) hasn't changed size then we can assume that properties don't need to be updated...
// Map<QName, Serializable> suppliedProps = node.getProperties();
// Date suppliedModifiedDate = (Date) suppliedProps.get(ContentModel.PROP_MODIFIED);
// String suppliedModifier = (String) suppliedProps.get(ContentModel.PROP_MODIFIER);
// ContentData suppliedContent = (ContentData) suppliedProps.get(ContentModel.PROP_CONTENT);
//
// Map<QName, Serializable> existingProps = nodeService.getProperties(nodeToUpdate);
// Date existingModifiedDate = (Date) existingProps.get(ContentModel.PROP_MODIFIED);
// String existingModifier = (String) existingProps.get(ContentModel.PROP_MODIFIER);
// ContentData existingContent = (ContentData) existingProps.get(ContentModel.PROP_CONTENT);
//
// updateNeeded = false;
// updateNeeded |= ((suppliedModifiedDate != null && !suppliedModifiedDate.equals(existingModifiedDate)) ||
// (existingModifiedDate != null && !existingModifiedDate.equals(suppliedModifiedDate)));
// updateNeeded |= ((suppliedContent != null && existingContent == null)
// || (suppliedContent == null && existingContent != null) || (suppliedContent != null
// && existingContent != null && suppliedContent.getSize() != existingContent.getSize()));
// updateNeeded |= ((suppliedModifier != null && !suppliedModifier.equals(existingModifier)) ||
// (existingModifier != null && !existingModifier.equals(suppliedModifier)));
return updateNeeded;
}
/**
 * @return the set of properties that are maintained locally and must not be overwritten by received values
 */
protected Set<QName> getLocalProperties()
{
return DEFAULT_LOCAL_PROPERTIES;
}
/**
 * Records the given primary parent association as an orphan, keyed by its (as yet unresolved) parent node.
 *
 * @param primaryParentAssoc the primary parent association of the orphaned node
 */
private void storeOrphanNode(ChildAssociationRef primaryParentAssoc)
{
List<ChildAssociationRef> orphansOfParent = orphans.get(primaryParentAssoc.getParentRef());
if (orphansOfParent == null)
{
orphansOfParent = new ArrayList<ChildAssociationRef>();
orphans.put(primaryParentAssoc.getParentRef(), orphansOfParent);
}
orphansOfParent.add(primaryParentAssoc);
}
/**
 * @param node the manifest node being processed when the error occurred
 * @param msgId the message id of the error to report
 */
private void error(TransferManifestNode node, String msgId)
{
TransferProcessingException ex = new TransferProcessingException(msgId);
log.error(ex.getMessage(), ex);
throw ex;
}
/**
 * @param msgId the message id of the error to report
 */
private void error(String msgId)
{
TransferProcessingException ex = new TransferProcessingException(msgId);
log.error(ex.getMessage(), ex);
throw ex;
}
protected void processHeader(TransferManifestHeader header)
{
}
/*
 * (non-Javadoc)
 *
 * @see org.alfresco.repo.transfer.manifest.TransferManifestProcessor#startTransferManifest()
 */
protected void startManifest()
{
}
/**
* @param nodeService
* the nodeService to set
*/
public void setNodeService(NodeService nodeService)
{
this.nodeService = nodeService;
}
/**
* @param contentService
* the contentService to set
*/
public void setContentService(ContentService contentService)
{
this.contentService = contentService;
}
/**
* @param nodeResolver
* the nodeResolver to set
*/
public void setNodeResolver(CorrespondingNodeResolver nodeResolver)
{
this.nodeResolver = nodeResolver;
}
}