Merged V3.3 to HEAD

20794: Merged DEV/V3.3-BUG-FIX to V3.3
      20792: Fix for unit test failures introduced by check-in 20771
      20791: ALF-3568: Include axiom jars in WAS shared library to solve Quickr connector issues
      20785: Merged DEV/BELARUS/V3.3-BUG-FIX-2010_06_14 to DEV/V3.3-BUG-FIX
         20644: Implemented closing of the browser window; for IE, the window.opener trick is used. Fixes ALF-1004: After closing Details Space, user doesn't return to his previous location
      20784: Fix for ALF-3516: Enterprise 3.X / Impossible to Create a Blog with Special Characters in the Title (?/!)
      20783: Fix for ALF-1087: Documents checked-out from Share do not have "Upload new version" action in Alfresco Explorer
      20782: Added multiday timed event handling to week view
      20775: Merged V3.3 to DEV/V3.3-BUG-FIX
         20670: Fix for ALF-3260: XSS attack is made in Wiki tab if First/Last user name contain xss. Also fixed double encoding errors found during regression testing.
      20772: Update to node browser to show namespace of attributes.
      20771: ALF-3591 - transferring rules.
         - also extends the behaviour filter.
      20770: ALF-3186 - action parameter values are not fully transferred - need to handle d:any
      20768: AVM - ALF-3611 (OrphanReaper + PurgeTestP + additional NPE fixes)
      20765: (RECORD ONLY) Merged BRANCHES/V3.3 to BRANCHES/DEV/V3.3-BUG-FIX:
         20708: DB2 build - add create/drop db ant targets (use DB2 cmdline - since not possible via JDBC/SQL)
         20722: DB2 build - run db2cmd in same window (follow-on to r20708)
      20764: Fix unreported JSON encoding issue with links components
      20762: Fix ALF-2599: Share - Cannot search for user currently logged on
      20759: DB2: fix FullNodeServiceTest.testLongMLTextValues (ALF-497)
         - TODO: fix create script when merging to HEAD
      20756: DB2: fix JBPMEngine*Test.* (ALF-3640) - follow-on (upgrade patch)
      20746: DB2: fix WebProjectServiceImplTest.testCreateWebProject (ALF-2300)
      20744: DB2: fix JBPMEngine*Test.* (ALF-3640) - missed file
      20743: DB2: fix JBPMEngine*Test.* (ALF-3640)
      20729: AVM - fix purge store so that root nodes are actually orphaned (ALF-3627)
         - also prelim for ALF-3611
      20720: (RECORD ONLY) ALF-3594: Merged HEAD to V3.3-BUGFIX
         20616: ALF-2265: Share 'Uber Filter' part 2
            - WebScriptNTLMAuthenticationFilter detached from its superclass and renamed to WebScriptSSOAuthenticationFilter
            - Now the filter simply chains to the downstream authentication filter rather than call its superclass
            - This means the same filter can be used for Kerberos-protected webscripts as well as NTLM
            - Wired globalAuthenticationFilter behind webscriptAuthenticationFilter in the filter chain in web.xml
            - Configured webscriptAuthenticationFilter for Kerberos subsystem
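            For illustration, the chaining described above follows the standard servlet filter delegation idiom. A minimal sketch assuming only the javax.servlet API (class and field names are hypothetical, not the actual Alfresco filter):

            import java.io.IOException;
            import javax.servlet.Filter;
            import javax.servlet.FilterChain;
            import javax.servlet.FilterConfig;
            import javax.servlet.ServletException;
            import javax.servlet.ServletRequest;
            import javax.servlet.ServletResponse;

            // Illustrative only: delegate to whichever downstream SSO filter is
            // configured (NTLM or Kerberos) instead of relying on a superclass.
            public class DelegatingSSOFilter implements Filter // hypothetical name
            {
                private Filter downstreamAuthenticationFilter; // resolved in init()

                public void init(FilterConfig config) throws ServletException
                {
                    // look up the configured downstream filter here (omitted)
                }

                public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                        throws IOException, ServletException
                {
                    if (downstreamAuthenticationFilter != null)
                    {
                        // chain to the configured SSO filter
                        downstreamAuthenticationFilter.doFilter(req, res, chain);
                    }
                    else
                    {
                        chain.doFilter(req, res);
                    }
                }

                public void destroy()
                {
                }
            }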
      20719: Merged DEV/TEMPORARY to V3.3-BUGFIX
         20696: ALF-3180: when using NTLM SSO, a user needs to log in first into the web UI before being able to mount alfresco using CIFS
            Added the missing person creation logic to the org.alfresco.filesys.auth.cifs.PassthruCifsAuthenticator.authenticateUser() method.
      20718: Merged DEV/TEMPORARY to V3.3-BUGFIX
         20659: ALF-3216: Incomplete settings for Lotus Quickr
            The protocol, host, port and context properties are removed, and a dependency on the org.alfresco.repo.admin.SysAdminParams interface is introduced (see the sketch below).
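            A minimal sketch of deriving the repository URL from SysAdminParams instead of separate protocol/host/port/context properties, assuming the standard SysAdminParams getters (the class name is hypothetical, not the actual Quickr connector code):

            import org.alfresco.repo.admin.SysAdminParams;

            // Illustrative only: build the repository base URL from SysAdminParams
            // rather than from separate protocol/host/port/context properties.
            public class QuickrUrlHelper // hypothetical name
            {
                private SysAdminParams sysAdminParams;

                public void setSysAdminParams(SysAdminParams sysAdminParams)
                {
                    this.sysAdminParams = sysAdminParams;
                }

                public String getRepositoryUrl()
                {
                    return sysAdminParams.getAlfrescoProtocol() + "://"
                            + sysAdminParams.getAlfrescoHost() + ":"
                            + sysAdminParams.getAlfrescoPort() + "/"
                            + sysAdminParams.getAlfrescoContext();
                }
            }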
      20711: Latest SpringSurf libs - fix for ALF-3557
      20710: Merged HEAD to BRANCHES/DEV/V3.3-BUG-FIX:
         20705: Fix ALF-3585: AtomPub summary can render first part of binary content resulting in invalid XML
      20691: Merged DEV/TEMPORARY to V3.3-BUGFIX
         19404: ALF-220: Editor can't rename files and folders via WebDav
            The WebDAV MOVE command now uses the FileFolderService rename method instead of the move method when a file is only being renamed (see the sketch below).
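            A minimal sketch of the distinction, assuming the standard FileFolderService API (the helper class and parameters are placeholders, not the actual WebDAV MoveMethod):

            import org.alfresco.service.cmr.model.FileFolderService;
            import org.alfresco.service.cmr.model.FileInfo;
            import org.alfresco.service.cmr.repository.NodeRef;

            // Illustrative only: a MOVE that keeps the same parent folder is a rename,
            // so call rename() rather than move().
            public class WebDavMoveSketch // hypothetical helper
            {
                public FileInfo moveOrRename(FileFolderService fileFolderService,
                        NodeRef nodeRef, NodeRef sourceParent, NodeRef destParent,
                        String newName) throws Exception
                {
                    if (sourceParent.equals(destParent))
                    {
                        // same parent: a pure rename
                        return fileFolderService.rename(nodeRef, newName);
                    }
                    // different parent: a real move
                    return fileFolderService.move(nodeRef, destParent, newName);
                }
            }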
      20663: ALF-3208 RenderingEngine actions should no longer appear in the list of available actions that can be fired using rules.
      20656: ALF-2645: LDAP sync now logs 'dangling references' for debugging purposes
      20651: ALF-485: FTP passthru authenticator logs authentication failures at debug level to avoid noise in the logs
      20646: Merge V2.2 To V3.3
         14301 : RECORD ONLY - ETWOTWO-1227 - fix to serialize FSR deployments.
         14618 : RECORD ONLY - Merge HEAD to 2.2
            13944 : After renaming a project, the deploy option disappears.
      20637: ALF-3123: Avoid NPE on Oracle when loading empty string values persisted through JMX and the attribute service
      20633: ALF-2057: LDAP synchronization lock now persists for a maximum of two minutes (instead of 24 hours!)
         - The exclusive lock acquired for LDAP sync from the JobLockService is now refreshed at one-minute intervals and never persists for more than two minutes (see the sketch below)
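         A rough sketch of that locking pattern, assuming the JobLockService getLock/refreshLock/releaseLock methods (the lock name, intervals and helper class are illustrative, not the actual synchronizer code):

         import org.alfresco.repo.lock.JobLockService;
         import org.alfresco.service.namespace.NamespaceService;
         import org.alfresco.service.namespace.QName;

         // Illustrative only: hold a short-lived exclusive lock and keep refreshing
         // it while the sync runs, so a crash cannot leave a stale day-long lock.
         public class SyncLockSketch // hypothetical name
         {
             private static final QName LOCK_QNAME = QName.createQName(
                     NamespaceService.SYSTEM_MODEL_1_0_URI, "LdapSyncSketchLock"); // placeholder
             private static final long LOCK_TTL = 120000L;        // 2 minutes
             private static final long REFRESH_INTERVAL = 60000L; // 1 minute

             public void runWithLock(JobLockService jobLockService, Runnable syncWork)
                     throws InterruptedException
             {
                 String lockToken = jobLockService.getLock(LOCK_QNAME, LOCK_TTL);
                 try
                 {
                     Thread worker = new Thread(syncWork);
                     worker.start();
                     while (worker.isAlive())
                     {
                         worker.join(REFRESH_INTERVAL);
                         if (worker.isAlive())
                         {
                             // keep the lock alive in one-minute steps
                             jobLockService.refreshLock(lockToken, LOCK_QNAME, LOCK_TTL);
                         }
                     }
                 }
                 finally
                 {
                     jobLockService.releaseLock(lockToken, LOCK_QNAME);
                 }
             }
         }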
      20628: ALF-1905: Allow use of anonymous bind for LDAP synchronization (NOT authentication)
         - Previously synchronization AND authentication shared the same setting for java.naming.security.authentication, meaning that if you tried to use anonymous bind for the synchronization side, the authentication side would complain.
         - Now there are two independent environments declared for the 'default' synchronization connection and the authentication connection
         - A new property ldap.synchronization.java.naming.security.authentication declares the authentication type used by synchronization. Set to "none" for anonymous bind.
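         For illustration, the split amounts to building two independent JNDI environments; a minimal sketch using plain javax.naming (the provider URL and class name are placeholders, not the Alfresco subsystem wiring):

         import java.util.Hashtable;
         import javax.naming.Context;
         import javax.naming.NamingException;
         import javax.naming.directory.InitialDirContext;

         // Illustrative only: the synchronization connection binds anonymously
         // while the authentication connection keeps using simple binds.
         public class LdapEnvSketch // hypothetical name
         {
             private static final String URL = "ldap://ldap.example.com:389"; // placeholder

             public InitialDirContext openSyncContext() throws NamingException
             {
                 Hashtable<String, String> env = new Hashtable<String, String>();
                 env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                 env.put(Context.PROVIDER_URL, URL);
                 // ldap.synchronization.java.naming.security.authentication=none
                 env.put(Context.SECURITY_AUTHENTICATION, "none");
                 return new InitialDirContext(env);
             }

             public InitialDirContext openAuthContext(String userDn, String password)
                     throws NamingException
             {
                 Hashtable<String, String> env = new Hashtable<String, String>();
                 env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                 env.put(Context.PROVIDER_URL, URL);
                 // the authentication side keeps its own independent setting
                 env.put(Context.SECURITY_AUTHENTICATION, "simple");
                 env.put(Context.SECURITY_PRINCIPAL, userDn);
                 env.put(Context.SECURITY_CREDENTIALS, password);
                 return new InitialDirContext(env);
             }
         }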
      20623: Fix for ALF-3188 : Access Denied when updating doc via CIFS
      20620: Merge DEV to V3.3-BUG-FIX
         20456 -  ALF-1824 : Setting alfresco.rmi.services.host on linux does not use specified host/IP
      20617: Merged DEV/BELARUS/V3.3-2010_06_08 to V3.3-BUG-FIX (with corrections)
         20606: ALF-651: Web Services client ContentUtils.convertToByteArray is broken
            - The org.alfresco.webservice.util.ContentUtils.convertToByteArray() method has been updated to handle conversion of large input streams.
            - org.alfresco.webservice.test.ContentUtilsTest is a test for the new functionality implemented in the ContentUtils class.
            - org.alfresco.webservice.test.resources.big-content.pdf is a large content file for the ContentUtilsTest.testInputStreamToByteArrayConversion() test.
      20613: Fixed ALF-1746: Metadata extractors are unable to remove ALL aspect-related properties
         - putRawValue keeps hold of 'null' values
         - All policies keep hold of 'null' values
         - Only affects 'carryAspectProperties=false'
      20609: Merged HEAD to V3.3-BUG-FIX
         20578: ALF-3178 - Transfer Service - to transfer a rule (i.e. a ruleFolder with its children) the PathHelper should allow "-" (dash character)
         20608: ALF-3178 - fix r20578 (mis-applied patch)
      20594: WebDAV BitKinex compatibility fix - Let the XML Parser work out the body encoding if it is not declared in the Content-Type header
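      For illustration only (not the actual WebDAV code): hand the parser the raw byte stream with no explicit encoding so it detects the charset from the XML declaration or BOM, and only force an encoding when the Content-Type header actually declares one. A minimal sketch:

      import java.io.InputStream;
      import javax.xml.parsers.DocumentBuilder;
      import javax.xml.parsers.DocumentBuilderFactory;
      import org.w3c.dom.Document;
      import org.xml.sax.InputSource;

      // Illustrative only: leave InputSource.setEncoding() unset unless the
      // request's Content-Type header declared a charset.
      public class BodyParseSketch // hypothetical name
      {
          public Document parseBody(InputStream body, String charsetFromHeader) throws Exception
          {
              DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
              InputSource source = new InputSource(body);
              if (charsetFromHeader != null)
              {
                  source.setEncoding(charsetFromHeader);
              }
              // otherwise the parser works out the encoding itself
              return builder.parse(source);
          }
      }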
      20588: (RECORD ONLY) Merged V3.3 to V3.3-BUG-FIX
         - Merged across all differences from V3.3
   20778: Added revision to version label.
   20777: Fix for ALF-2451 - installer correctly configures the Share port
   20722: DB2 build - run db2cmd in same window (follow-on to r20712)
   20721: DB2 build - fix create target and add "/c" to exit "db2cmd"
      - TODO: add wait/timeout target, ideally checking for created DB 


git-svn-id: https://svn.alfresco.com/repos/alfresco-enterprise/alfresco/HEAD/root@20796 c4b6b30b-aa2e-2d43-bbcb-ca4b014f7261
Dave Ward
2010-06-24 15:47:38 +00:00
parent 0f1a1a4bc2
commit 9963da3d51
54 changed files with 871 additions and 835 deletions

View File

@@ -269,14 +269,17 @@ public class PassthruCifsAuthenticator extends CifsAuthenticatorBase implements
else
{
// Map the passthru username to an Alfresco person
String username = client.getUserName();
String personName = getPersonService().getUserIdentifier( username);
String personName = getPersonService().getUserIdentifier(username);
if (null == personName)
{
personName = username;
}
if ( personName != null)
{
// Use the person name as the current user
getAuthenticationComponent().setCurrentUser(personName);
alfClient.setAuthenticationTicket(getAuthenticationService().getCurrentTicket());

View File

@@ -415,7 +415,7 @@ public class PassthruFtpAuthenticator extends FTPAuthenticatorBase {
}
catch (Exception ex)
{
logger.error("Passthru error", ex);
logger.debug("Passthru error", ex);
}
finally {

View File

@@ -186,6 +186,11 @@ public class ContentMetadataExtracter extends ActionExecuterAbstractBase
Set<QName> requiredAspectQNames = new HashSet<QName>(3);
Set<QName> aspectPropertyQNames = new HashSet<QName>(17);
/**
* The modified properties contain null values as well. As we are only interested
* in the keys, this will force aspect properties to be removed even if there
* are no settable properties pertaining to the aspect.
*/
for (QName propertyQName : modifiedProperties.keySet())
{
PropertyDefinition propertyDef = dictionaryService.getProperty(propertyQName);
@@ -212,6 +217,12 @@ public class ContentMetadataExtracter extends ActionExecuterAbstractBase
{
if (!modifiedProperties.containsKey(aspectPropertyQName))
{
// Simple case: This property was not extracted
nodeProperties.remove(aspectPropertyQName);
}
else if (modifiedProperties.get(aspectPropertyQName) == null)
{
// Trickier (ALF-1823): The property was extracted as 'null'
nodeProperties.remove(aspectPropertyQName);
}
}

View File

@@ -123,9 +123,9 @@ public class ContentMetadataExtracterTest extends BaseSpringTest
private static final QName PROP_UNKNOWN_1 = QName.createQName(NamespaceService.CONTENT_MODEL_1_0_URI, "unkown1");
private static final QName PROP_UNKNOWN_2 = QName.createQName(NamespaceService.CONTENT_MODEL_1_0_URI, "unkown2");
private static class UnknownMetadataExtracter extends AbstractMappingMetadataExtracter
private static class TestUnknownMetadataExtracter extends AbstractMappingMetadataExtracter
{
public UnknownMetadataExtracter()
public TestUnknownMetadataExtracter()
{
Properties mappingProperties = new Properties();
mappingProperties.put("unknown1", PROP_UNKNOWN_1.toString());
@@ -156,7 +156,7 @@ public class ContentMetadataExtracterTest extends BaseSpringTest
public void testUnknownProperties()
{
MetadataExtracterRegistry registry = (MetadataExtracterRegistry) applicationContext.getBean("metadataExtracterRegistry");
UnknownMetadataExtracter extracterUnknown = new UnknownMetadataExtracter();
TestUnknownMetadataExtracter extracterUnknown = new TestUnknownMetadataExtracter();
extracterUnknown.setRegistry(registry);
extracterUnknown.register();
// Now add some content with a binary mimetype
@@ -174,6 +174,85 @@ public class ContentMetadataExtracterTest extends BaseSpringTest
assertNotNull("Unknown property is null", prop1);
assertNotNull("Unknown property is null", prop2);
}
private static class TestNullPropMetadataExtracter extends AbstractMappingMetadataExtracter
{
public TestNullPropMetadataExtracter()
{
Properties mappingProperties = new Properties();
mappingProperties.put("title", ContentModel.PROP_TITLE.toString());
mappingProperties.put("description", ContentModel.PROP_DESCRIPTION.toString());
setMappingProperties(mappingProperties);
}
@Override
protected Map<String, Set<QName>> getDefaultMapping()
{
// No need to give anything back as we have explicitly set the mapping already
return new HashMap<String, Set<QName>>(0);
}
@Override
public boolean isSupported(String sourceMimetype)
{
return sourceMimetype.equals(MimetypeMap.MIMETYPE_BINARY);
}
public Map<String, Serializable> extractRaw(ContentReader reader) throws Throwable
{
Map<String, Serializable> rawMap = newRawMap();
putRawValue("title", null, rawMap);
putRawValue("description", "", rawMap);
return rawMap;
}
}
/**
* Ensure that missing raw values result in node properties being removed
* when running with {@link ContentMetadataExtracter#setCarryAspectProperties(boolean)}
* set to <tt>false</tt>.
*/
public void testNullExtractedValues_ALF1823()
{
MetadataExtracterRegistry registry = (MetadataExtracterRegistry) applicationContext.getBean("metadataExtracterRegistry");
TestNullPropMetadataExtracter extractor = new TestNullPropMetadataExtracter();
extractor.setRegistry(registry);
extractor.register();
// Now set the title and description
nodeService.setProperty(nodeRef, ContentModel.PROP_TITLE, "TITLE");
nodeService.setProperty(nodeRef, ContentModel.PROP_DESCRIPTION, "DESCRIPTION");
// Now add some content with a binary mimetype
ContentWriter cw = this.contentService.getWriter(nodeRef, ContentModel.PROP_CONTENT, true);
cw.setMimetype(MimetypeMap.MIMETYPE_BINARY);
cw.putContent("Content for " + getName());
ActionImpl action = new ActionImpl(null, ID, SetPropertyValueActionExecuter.NAME, null);
executer.execute(action, this.nodeRef);
// cm:titled properties should be present
Serializable title = nodeService.getProperty(nodeRef, ContentModel.PROP_TITLE);
Serializable descr = nodeService.getProperty(nodeRef, ContentModel.PROP_DESCRIPTION);
assertNotNull("cm:title property is null", title);
assertNotNull("cm:description property is null", descr);
try
{
// Now change the setting to remove unset aspect properties
executer.setCarryAspectProperties(false);
// Extract again
executer.execute(action, this.nodeRef);
// cm:titled properties should *NOT* be present
title = nodeService.getProperty(nodeRef, ContentModel.PROP_TITLE);
descr = nodeService.getProperty(nodeRef, ContentModel.PROP_DESCRIPTION);
assertNull("cm:title property is not null", title);
assertNull("cm:description property is not null", descr);
}
finally
{
executer.setCarryAspectProperties(true);
}
}
/**
* Test execution of the pragmatic approach

View File

@@ -74,16 +74,6 @@ public class AVMDAOs
*/
public ChildEntryDAO fChildEntryDAO;
/**
* The HistoryLinkDAO.
*/
public HistoryLinkDAO fHistoryLinkDAO;
/**
* The MergeLinkDAO.
*/
public MergeLinkDAO fMergeLinkDAO;
/**
* The AVMStorePropertyDAO
*/
@@ -123,23 +113,7 @@ public class AVMDAOs
{
fChildEntryDAO = childEntryDAO;
}
/**
* @param historyLinkDAO the fHistoryLinkDAO to set
*/
public void setHistoryLinkDAO(HistoryLinkDAO historyLinkDAO)
{
fHistoryLinkDAO = historyLinkDAO;
}
/**
* @param mergeLinkDAO the fMergeLinkDAO to set
*/
public void setMergeLinkDAO(MergeLinkDAO mergeLinkDAO)
{
fMergeLinkDAO = mergeLinkDAO;
}
/**
* @param aVMStoreDAO The fAVMStoreDAO to set
*/

View File

@@ -26,6 +26,7 @@ import java.util.Set;
import org.alfresco.repo.avm.util.RawServices;
import org.alfresco.repo.domain.DbAccessControlList;
import org.alfresco.repo.domain.PropertyValue;
import org.alfresco.repo.domain.avm.AVMHistoryLinkEntity;
import org.alfresco.repo.security.permissions.ACLCopyMode;
import org.alfresco.service.cmr.avm.AVMReadOnlyException;
import org.alfresco.service.namespace.QName;
@@ -135,22 +136,19 @@ public abstract class AVMNodeImpl implements AVMNode
{
return;
}
HistoryLinkImpl link = new HistoryLinkImpl();
link.setAncestor(ancestor);
link.setDescendent(this);
AVMDAOs.Instance().fHistoryLinkDAO.save(link);
AVMDAOs.Instance().newAVMNodeLinksDAO.createHistoryLink(ancestor.getId(), this.getId());
}
/**
* Change the ancestor of this node.
* @param ancestor The new ancestor to give it.
*/
public void changeAncestor(AVMNode ancestor)
{
HistoryLink old = AVMDAOs.Instance().fHistoryLinkDAO.getByDescendent(this);
if (old != null)
AVMHistoryLinkEntity hlEntity = AVMDAOs.Instance().newAVMNodeLinksDAO.getHistoryLinkByDescendent(this.getId());
if (hlEntity != null)
{
AVMDAOs.Instance().fHistoryLinkDAO.delete(old);
AVMDAOs.Instance().newAVMNodeLinksDAO.deleteHistoryLink(hlEntity.getAncestorNodeId(), hlEntity.getDescendentNodeId());
}
setAncestor(ancestor);
}
@@ -174,10 +172,7 @@ public abstract class AVMNodeImpl implements AVMNode
{
return;
}
MergeLinkImpl link = new MergeLinkImpl();
link.setMfrom(mergedFrom);
link.setMto(this);
AVMDAOs.Instance().fMergeLinkDAO.save(link);
AVMDAOs.Instance().newAVMNodeLinksDAO.createMergeLink(mergedFrom.getId(), this.getId());
}
/**

View File

@@ -58,9 +58,9 @@ import org.alfresco.service.cmr.security.PermissionContext;
import org.alfresco.service.cmr.security.PermissionService;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.FileNameValidator;
import org.alfresco.util.Pair;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.alfresco.util.Pair;
/**
* This or AVMStore are the implementors of the operations specified by AVMService.
@@ -1016,12 +1016,15 @@ public class AVMRepository
throw new AccessDeniedException("Not allowed to purge: " + name);
}
root.setIsRoot(false);
fAVMNodeDAO.update(root);
List<VersionRoot> vRoots = fVersionRootDAO.getAllInAVMStore(store);
for (VersionRoot vr : vRoots)
{
AVMNode node = fAVMNodeDAO.getByID(vr.getRoot().getId());
root.setIsRoot(false);
fAVMNodeDAO.update(node);
fVersionLayeredNodeEntryDAO.delete(vr);
@@ -1038,6 +1041,11 @@ public class AVMRepository
fAVMStoreDAO.delete(store);
fAVMStoreDAO.invalidateCache();
fPurgeStoreTxnListener.storePurged(name);
if (fgLogger.isDebugEnabled())
{
fgLogger.debug("Purged store: "+name);
}
}
/**
@@ -1059,6 +1067,11 @@ public class AVMRepository
fLookupCache.onDelete(name);
store.purgeVersion(version);
fPurgeVersionTxnListener.versionPurged(name, version);
if (fgLogger.isDebugEnabled())
{
fgLogger.debug("Purged version: "+name+" "+version);
}
}
/**

View File

@@ -844,8 +844,11 @@ public class AVMStoreImpl implements AVMStore
throw new AVMNotFoundException("Not allowed to delete in store : " + getName() +" at " + path);
}
dir.removeChild(lPath, name);
//dir.updateModTime();
if (dir != null)
{
dir.removeChild(lPath, name);
//dir.updateModTime();
}
}
/**

View File

@@ -315,6 +315,10 @@ class AVMTester implements Runnable
{
String name = fNames[fgRandom.nextInt(26 * 26)];
String path = randomPath();
if (path == null)
{
return;
}
AVMNodeDescriptor desc = fService.lookup(-1, path);
if (desc == null)
{
@@ -398,6 +402,10 @@ class AVMTester implements Runnable
private void removeNode()
{
String target = randomPath();
if (target == null)
{
return;
}
int lastSlash = target.lastIndexOf('/');
String path = target.substring(0, lastSlash);
if (path.equals("main:"))

View File

@@ -1,50 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm;
/**
* Interface for the ancestor-descendent relationship.
* @author britt
*/
public interface HistoryLink
{
/**
* Set the ancestor part of this.
* @param ancestor
*/
public void setAncestor(AVMNode ancestor);
/**
* Get the ancestor part of this.
* @return The ancestor.
*/
public AVMNode getAncestor();
/**
* Set the descendent part of this.
* @param descendent
*/
public void setDescendent(AVMNode descendent);
/**
* Get the descendent part of this.
* @return The descendent of this link.
*/
public AVMNode getDescendent();
}

View File

@@ -1,54 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm;
import java.util.List;
/**
* DAO for history links.
* @author britt
*/
public interface HistoryLinkDAO
{
/**
* Save and unsaved HistoryLink.
* @param link
*/
public void save(HistoryLink link);
/**
* Get the history link with the given descendent.
* @param descendent The descendent.
* @return The HistoryLink or null if not found.
*/
public HistoryLink getByDescendent(AVMNode descendent);
/**
* Get all the descendents of a node.
* @param ancestor The ancestor node.
* @return A List of AVMNode descendents.
*/
public List<HistoryLink> getByAncestor(AVMNode ancestor);
/**
* Delete a HistoryLink
* @param link The link to delete.
*/
public void delete(HistoryLink link);
}

View File

@@ -1,106 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm;
import java.io.Serializable;
/**
* Holds a ancestor-descendent relationship.
* @author britt
*/
public class HistoryLinkImpl implements HistoryLink, Serializable
{
private static final long serialVersionUID = -430859344980137718L;
/**
* The ancestor.
*/
private AVMNode fAncestor;
/**
* The descendent.
*/
private AVMNode fDescendent;
/**
* Set the ancestor part of this.
* @param ancestor
*/
public void setAncestor(AVMNode ancestor)
{
fAncestor = ancestor;
}
/**
* Get the ancestor part of this.
* @return The ancestor.
*/
public AVMNode getAncestor()
{
return fAncestor;
}
/**
* Set the descendent part of this.
* @param descendent
*/
public void setDescendent(AVMNode descendent)
{
fDescendent = descendent;
}
/**
* Get the descendent part of this.
* @return The descendent.
*/
public AVMNode getDescendent()
{
return fDescendent;
}
/**
* Equals override.
* @param obj
* @return Equality.
*/
@Override
public boolean equals(Object obj)
{
if (this == obj)
{
return true;
}
if (!(obj instanceof HistoryLink))
{
return false;
}
HistoryLink o = (HistoryLink)obj;
return fAncestor.equals(o.getAncestor()) && fDescendent.equals(o.getDescendent());
}
/**
* Get the hashcode.
* @return The hashcode.
*/
@Override
public int hashCode()
{
return fAncestor.hashCode() + fDescendent.hashCode();
}
}

View File

@@ -1,50 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm;
/**
* This is the interface for the merged from - to relationship.
* @author britt
*/
public interface MergeLink
{
/**
* Set the from part.
* @param from
*/
public void setMfrom(AVMNode from);
/**
* Get the from part.
* @return The from part.
*/
public AVMNode getMfrom();
/**
* Set the to part.
* @param to
*/
public void setMto(AVMNode to);
/**
* Get the to part.
* @return The to part.
*/
public AVMNode getMto();
}

View File

@@ -1,54 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm;
import java.util.List;
/**
* DAO for MergeLinks.
* @author britt
*/
public interface MergeLinkDAO
{
/**
* Save an unsaved MergeLink.
* @param link The link to save.
*/
public void save(MergeLink link);
/**
* Get a link from the merged to node.
* @param to The node merged to.
* @return An AVMNode or null if not found.
*/
public MergeLink getByTo(AVMNode to);
/**
* Get all the link that the given node was merged to.
* @param from The node that was merged from
* @return A List of MergeLinks.
*/
public List<MergeLink> getByFrom(AVMNode from);
/**
* Delete a link.
* @param link The link to delete.
*/
public void delete(MergeLink link);
}

View File

@@ -1,106 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm;
import java.io.Serializable;
/**
* This contains a single merged from-to relationship.
* @author britt
*/
public class MergeLinkImpl implements MergeLink, Serializable
{
private static final long serialVersionUID = 6672271083042424944L;
/**
* The node that was merged from.
*/
private AVMNode fFrom;
/**
* The node that was merged to.
*/
private AVMNode fTo;
/**
* Set the from part.
* @param from
*/
public void setMfrom(AVMNode from)
{
fFrom = from;
}
/**
* Get the from part.
* @return The from part.
*/
public AVMNode getMfrom()
{
return fFrom;
}
/**
* Set the to part.
* @param to
*/
public void setMto(AVMNode to)
{
fTo = to;
}
/**
* Get the to part.
* @return The to part.
*/
public AVMNode getMto()
{
return fTo;
}
/**
* Override of equals.
* @param obj
* @return Equality.
*/
@Override
public boolean equals(Object obj)
{
if (this == obj)
{
return true;
}
if (!(obj instanceof MergeLink))
{
return false;
}
MergeLink o = (MergeLink)obj;
return fFrom.equals(o.getMfrom()) && fTo.equals(o.getMto());
}
/**
* Get the hash code.
* @return The hash code.
*/
@Override
public int hashCode()
{
return fFrom.hashCode() + fTo.hashCode();
}
}

View File

@@ -22,12 +22,13 @@ import java.util.LinkedList;
import java.util.List;
import org.alfresco.repo.domain.DbAccessControlList;
import org.alfresco.repo.domain.avm.AVMHistoryLinkEntity;
import org.alfresco.repo.domain.avm.AVMMergeLinkEntity;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.service.cmr.repository.ContentData;
import org.alfresco.service.transaction.TransactionService;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.hibernate.SessionFactory;
/**
* This is the background thread for reaping no longer referenced nodes in the AVM repository. These orphans arise from
@@ -43,9 +44,20 @@ public class OrphanReaper
{
if (fRunning)
{
if (fgLogger.isDebugEnabled())
{
fgLogger.debug("OrphanReaper is already running - just return");
}
return;
}
fRunning = true;
if (fgLogger.isTraceEnabled())
{
fgLogger.trace("Start running OrphanReaper ...");
}
}
try
{
@@ -54,14 +66,23 @@ public class OrphanReaper
doBatch();
if (fDone)
{
if (fgLogger.isTraceEnabled())
{
fgLogger.trace("OrphanReaper is done - just return");
}
return;
}
try
{
if (fgLogger.isTraceEnabled())
{
fgLogger.trace("OrphanReaper is not done - sleep for "+fActiveBaseSleep+" ms");
}
Thread.sleep(fActiveBaseSleep);
}
catch (InterruptedException e)
{
fgLogger.warn("OrphanReaper was interrupted - do nothing: "+e);
// Do nothing.
}
}
@@ -72,6 +93,11 @@ public class OrphanReaper
synchronized (this)
{
fRunning = false;
if (fgLogger.isTraceEnabled())
{
fgLogger.trace("... finish running OrphanReaper");
}
}
}
}
@@ -239,6 +265,11 @@ public class OrphanReaper
List<AVMNode> nodes = AVMDAOs.Instance().fAVMNodeDAO.getOrphans(fQueueLength);
if (nodes.size() == 0)
{
if (fgLogger.isTraceEnabled())
{
fgLogger.trace("Nothing to purge (set fActive = false)");
}
fActive = false;
return null;
}
@@ -248,55 +279,80 @@ public class OrphanReaper
fPurgeQueue.add(node.getId());
}
}
if (fgLogger.isDebugEnabled())
{
fgLogger.debug("Found orphan nodes (fpurgeQueue size = "+fPurgeQueue.size()+")");
}
fActive = true;
for (int i = 0; i < fBatchSize; i++)
{
if (fPurgeQueue.size() == 0)
{
if (fgLogger.isDebugEnabled())
{
fgLogger.debug("Purge queue is empty (fpurgeQueue size = "+fPurgeQueue.size()+")");
}
fPurgeQueue = null;
return null;
}
AVMNode node = AVMDAOs.Instance().fAVMNodeDAO.getByID(fPurgeQueue.removeFirst());
Long nodeId = fPurgeQueue.removeFirst();
AVMNode node = AVMDAOs.Instance().fAVMNodeDAO.getByID(nodeId);
if (node == null)
{
// eg. cluster, multiple reapers
fgLogger.warn("Node ["+nodeId+"] not found - assume multiple reapers ...");
continue;
}
// Save away the ancestor and merged from fields from this node.
HistoryLink hlink = AVMDAOs.Instance().fHistoryLinkDAO.getByDescendent(node);
AVMNode ancestor = null;
if (hlink != null)
AVMHistoryLinkEntity hlEntity = AVMDAOs.Instance().newAVMNodeLinksDAO.getHistoryLinkByDescendent(node.getId());
if (hlEntity != null)
{
ancestor = hlink.getAncestor();
AVMDAOs.Instance().fHistoryLinkDAO.delete(hlink);
ancestor = AVMDAOs.Instance().fAVMNodeDAO.getByID(hlEntity.getAncestorNodeId());
AVMDAOs.Instance().newAVMNodeLinksDAO.deleteHistoryLink(hlEntity.getAncestorNodeId(), hlEntity.getDescendentNodeId());
}
MergeLink mlink = AVMDAOs.Instance().fMergeLinkDAO.getByTo(node);
AVMNode mergedFrom = null;
if (mlink != null)
AVMMergeLinkEntity mlEntity = AVMDAOs.Instance().newAVMNodeLinksDAO.getMergeLinkByTo(node.getId());
if (mlEntity != null)
{
mergedFrom = mlink.getMfrom();
AVMDAOs.Instance().fMergeLinkDAO.delete(mlink);
mergedFrom = AVMDAOs.Instance().fAVMNodeDAO.getByID(mlEntity.getMergeFromNodeId());
AVMDAOs.Instance().newAVMNodeLinksDAO.deleteMergeLink(mlEntity.getMergeFromNodeId(), mlEntity.getMergeToNodeId());
}
// Get all the nodes that have this node as ancestor.
List<HistoryLink> links = AVMDAOs.Instance().fHistoryLinkDAO.getByAncestor(node);
for (HistoryLink link : links)
List<AVMHistoryLinkEntity> hlEntities = AVMDAOs.Instance().newAVMNodeLinksDAO.getHistoryLinksByAncestor(node.getId());
for (AVMHistoryLinkEntity link : hlEntities)
{
AVMNode desc = link.getDescendent();
desc.setAncestor(ancestor);
if (desc.getMergedFrom() == null)
AVMNode desc = AVMDAOs.Instance().fAVMNodeDAO.getByID(link.getDescendentNodeId());
if (desc != null)
{
desc.setMergedFrom(mergedFrom);
desc.setAncestor(ancestor);
if (desc.getMergedFrom() == null)
{
desc.setMergedFrom(mergedFrom);
}
}
AVMDAOs.Instance().fHistoryLinkDAO.delete(link);
AVMDAOs.Instance().newAVMNodeLinksDAO.deleteHistoryLink(link.getAncestorNodeId(), link.getDescendentNodeId());
}
// Get all the nodes that have this node as mergedFrom
List<MergeLink> mlinks = AVMDAOs.Instance().fMergeLinkDAO.getByFrom(node);
for (MergeLink link : mlinks)
List<AVMMergeLinkEntity> mlEntities = AVMDAOs.Instance().newAVMNodeLinksDAO.getMergeLinksByFrom(node.getId());
for (AVMMergeLinkEntity link : mlEntities)
{
link.getMto().setMergedFrom(ancestor);
AVMDAOs.Instance().fMergeLinkDAO.delete(link);
AVMNode mto = AVMDAOs.Instance().fAVMNodeDAO.getByID(link.getMergeToNodeId());
if (mto != null)
{
mto.setMergedFrom(ancestor);
}
AVMDAOs.Instance().newAVMNodeLinksDAO.deleteMergeLink(link.getMergeFromNodeId(), link.getMergeToNodeId());
}
// Get rid of all properties belonging to this node.
@@ -334,6 +390,11 @@ public class OrphanReaper
}
// Finally, delete it
AVMDAOs.Instance().fAVMNodeDAO.delete(node);
if (fgLogger.isTraceEnabled())
{
fgLogger.trace("Deleted Node ["+node.getId()+"]");
}
}
return null;
}

View File

@@ -20,6 +20,8 @@ package org.alfresco.repo.avm;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.repo.avm.util.BulkLoader;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
/**
* Test the purge thread.
@@ -27,6 +29,46 @@ import org.alfresco.repo.avm.util.BulkLoader;
*/
public class PurgeTestP extends AVMServiceTestBase
{
private static Log logger = LogFactory.getLog(PurgeTestP.class);
public void testSetup()
{
runOrphanReaper();
}
public void testRemoveNodes() throws Throwable
{
try
{
logger.info("testRemoveNodes");
runOrphanReaper();
int fileCount = 10;
logger.info("Create "+fileCount+" files ...");
for (int i = 1; i <= fileCount; i++)
{
fService.createFile("main:/", "file"+i);
}
logger.info("Remove "+fileCount+" files ...");
for (int i = 1; i <= fileCount; i++)
{
fService.removeNode("main:/", "file"+i);
}
runOrphanReaper();
}
catch (Exception e)
{
e.printStackTrace(System.err);
throw e;
}
}
/**
* Test purging a version.
*/
@@ -34,6 +76,10 @@ public class PurgeTestP extends AVMServiceTestBase
{
try
{
logger.info("testPurgeVersion");
runOrphanReaper();
setupBasicTree();
BulkLoader loader = new BulkLoader();
loader.setAvmService(fService);
@@ -44,9 +90,9 @@ public class PurgeTestP extends AVMServiceTestBase
loader.recursiveLoad("source/java/org/alfresco/repo/avm", "main:/");
System.err.println("Load time: " + (System.currentTimeMillis() - start) + "ms");
logger.info("Load time: " + (System.currentTimeMillis() - start) + "ms");
fService.createSnapshot("main", null, null);
System.err.println("Load time + snapshot: " + (System.currentTimeMillis() - start) + "ms");
logger.info("Load time + snapshot: " + (System.currentTimeMillis() - start) + "ms");
fService.purgeVersion(2, "main");
runOrphanReaper();
@@ -65,6 +111,10 @@ public class PurgeTestP extends AVMServiceTestBase
{
try
{
logger.info("testPurgeOlderVersion");
runOrphanReaper();
setupBasicTree();
BulkLoader loader = new BulkLoader();
loader.setAvmService(fService);
@@ -75,9 +125,9 @@ public class PurgeTestP extends AVMServiceTestBase
loader.recursiveLoad("source/java/org/alfresco/repo/avm", "main:/");
System.err.println("Load time: " + (System.currentTimeMillis() - start) + "ms");
logger.info("Load time: " + (System.currentTimeMillis() - start) + "ms");
fService.createSnapshot("main", null, null);
System.err.println("Load time + snapshot: " + (System.currentTimeMillis() - start) + "ms");
logger.info("Load time + snapshot: " + (System.currentTimeMillis() - start) + "ms");
//fService.removeNode("main:/source/java/org/alfresco", "repo");
@@ -103,7 +153,12 @@ public class PurgeTestP extends AVMServiceTestBase
{
try
{
logger.info("testPurgeStore");
runOrphanReaper();
setupBasicTree();
BulkLoader loader = new BulkLoader();
loader.setAvmService(fService);
long start = System.currentTimeMillis();
@@ -113,9 +168,9 @@ public class PurgeTestP extends AVMServiceTestBase
loader.recursiveLoad("source/java/org/alfresco/repo/avm", "main:/");
System.err.println("Load time: " + (System.currentTimeMillis() - start) + "ms");
logger.info("Load time: " + (System.currentTimeMillis() - start) + "ms");
fService.createSnapshot("main", null, null);
System.err.println("Load time + snapshot: " + (System.currentTimeMillis() - start) + "ms");
logger.info("Load time + snapshot: " + (System.currentTimeMillis() - start) + "ms");
//fService.createLayeredDirectory("main:/source", "main:/", "layer");
@@ -125,8 +180,8 @@ public class PurgeTestP extends AVMServiceTestBase
fService.removeNode("main:/layer", "actions");
fService.createFile("main:/layer", "goofy").close();
fService.createSnapshot("main", null, null);
fService.purgeStore("main");
runOrphanReaper();
@@ -140,7 +195,10 @@ public class PurgeTestP extends AVMServiceTestBase
private void runOrphanReaper()
{
logger.info("Reaper started");
fReaper.activate();
fReaper.execute();
final int maxCycles = 100;
@@ -149,12 +207,13 @@ public class PurgeTestP extends AVMServiceTestBase
{
try
{
System.out.print(".");
//System.out.print(".");
Thread.sleep(2000);
}
catch (InterruptedException e)
{
// Do nothing.
logger.warn("OrphanReaper was interrupted - do nothing: "+e);
}
cycles++;
@@ -165,6 +224,6 @@ public class PurgeTestP extends AVMServiceTestBase
throw new AlfrescoRuntimeException("Orphan reaper still active - failed to clean orphans in "+cycles+" cycles (max "+maxCycles+")");
}
System.out.println("\nReaper finished (in "+cycles+" cycles)");
logger.info("Reaper finished (in "+cycles+" cycles)");
}
}

View File

@@ -34,19 +34,19 @@ import org.alfresco.repo.avm.BasicAttributesImpl;
import org.alfresco.repo.avm.DeletedNode;
import org.alfresco.repo.avm.DeletedNodeImpl;
import org.alfresco.repo.avm.DirectoryNode;
import org.alfresco.repo.avm.HistoryLink;
import org.alfresco.repo.avm.Layered;
import org.alfresco.repo.avm.LayeredDirectoryNode;
import org.alfresco.repo.avm.LayeredDirectoryNodeImpl;
import org.alfresco.repo.avm.LayeredFileNode;
import org.alfresco.repo.avm.LayeredFileNodeImpl;
import org.alfresco.repo.avm.MergeLink;
import org.alfresco.repo.avm.PlainDirectoryNode;
import org.alfresco.repo.avm.PlainDirectoryNodeImpl;
import org.alfresco.repo.avm.PlainFileNode;
import org.alfresco.repo.avm.PlainFileNodeImpl;
import org.alfresco.repo.domain.DbAccessControlList;
import org.alfresco.repo.domain.PropertyValue;
import org.alfresco.repo.domain.avm.AVMHistoryLinkEntity;
import org.alfresco.repo.domain.avm.AVMMergeLinkEntity;
import org.alfresco.repo.domain.avm.AVMNodeEntity;
import org.alfresco.repo.domain.avm.AVMVersionRootEntity;
import org.alfresco.service.namespace.QName;
@@ -208,12 +208,12 @@ class AVMNodeDAOIbatis implements AVMNodeDAO
*/
public AVMNode getAncestor(AVMNode descendent)
{
HistoryLink hl = AVMDAOs.Instance().fHistoryLinkDAO.getByDescendent(descendent);
if (hl == null)
AVMHistoryLinkEntity hlEntity = AVMDAOs.Instance().newAVMNodeLinksDAO.getHistoryLinkByDescendent(descendent.getId());
if (hlEntity == null)
{
return null;
}
return hl.getAncestor();
return AVMDAOs.Instance().fAVMNodeDAO.getByID(hlEntity.getAncestorNodeId());
}
/* (non-Javadoc)
@@ -221,12 +221,12 @@ class AVMNodeDAOIbatis implements AVMNodeDAO
*/
public AVMNode getMergedFrom(AVMNode mTo)
{
MergeLink ml = AVMDAOs.Instance().fMergeLinkDAO.getByTo(mTo);
if (ml == null)
AVMMergeLinkEntity mlEntity = AVMDAOs.Instance().newAVMNodeLinksDAO.getMergeLinkByTo(mTo.getId());
if (mlEntity == null)
{
return null;
}
return ml.getMfrom();
return AVMDAOs.Instance().fAVMNodeDAO.getByID(mlEntity.getMergeFromNodeId());
}
/* (non-Javadoc)

View File

@@ -1,95 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm.ibatis;
import java.util.ArrayList;
import java.util.List;
import org.alfresco.repo.avm.AVMDAOs;
import org.alfresco.repo.avm.AVMNode;
import org.alfresco.repo.avm.HistoryLink;
import org.alfresco.repo.avm.HistoryLinkDAO;
import org.alfresco.repo.avm.HistoryLinkImpl;
import org.alfresco.repo.domain.avm.AVMHistoryLinkEntity;
/**
* iBATIS DAO wrapper for HistoryLink
*
* @author janv
*/
class HistoryLinkDAOIbatis implements HistoryLinkDAO
{
/* (non-Javadoc)
* @see org.alfresco.repo.avm.HistoryLinkDAO#save(org.alfresco.repo.avm.HistoryLink)
*/
public void save(HistoryLink link)
{
AVMDAOs.Instance().newAVMNodeLinksDAO.createHistoryLink(link.getAncestor().getId(), link.getDescendent().getId());
}
/* (non-Javadoc)
* @see org.alfresco.repo.avm.HistoryLinkDAO#getByDescendent(org.alfresco.repo.avm.AVMNode)
*/
public HistoryLink getByDescendent(AVMNode descendent)
{
AVMHistoryLinkEntity hlEntity = AVMDAOs.Instance().newAVMNodeLinksDAO.getHistoryLinkByDescendent(descendent.getId());
if (hlEntity == null)
{
return null;
}
AVMNode ancestor = AVMDAOs.Instance().fAVMNodeDAO.getByID(hlEntity.getAncestorNodeId());
HistoryLink hl = new HistoryLinkImpl();
hl.setAncestor(ancestor);
hl.setDescendent(descendent);
return hl;
}
/* (non-Javadoc)
* @see org.alfresco.repo.avm.HistoryLinkDAO#getByAncestor(org.alfresco.repo.avm.AVMNode)
*/
public List<HistoryLink> getByAncestor(AVMNode ancestor)
{
List<AVMHistoryLinkEntity> hlEntities = AVMDAOs.Instance().newAVMNodeLinksDAO.getHistoryLinksByAncestor(ancestor.getId());
List<HistoryLink> hls = new ArrayList<HistoryLink>(hlEntities.size());
for (AVMHistoryLinkEntity hlEntity : hlEntities)
{
AVMNode descendent = AVMDAOs.Instance().fAVMNodeDAO.getByID(hlEntity.getDescendentNodeId());
HistoryLink hl = new HistoryLinkImpl();
hl.setAncestor(ancestor);
hl.setDescendent(descendent);
hls.add(hl);
}
return hls;
}
/* (non-Javadoc)
* @see org.alfresco.repo.avm.HistoryLinkDAO#delete(org.alfresco.repo.avm.HistoryLink)
*/
public void delete(HistoryLink link)
{
AVMDAOs.Instance().newAVMNodeLinksDAO.deleteHistoryLink(link.getAncestor().getId(), link.getDescendent().getId());
}
}

View File

@@ -1,95 +0,0 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>. */
package org.alfresco.repo.avm.ibatis;
import java.util.ArrayList;
import java.util.List;
import org.alfresco.repo.avm.AVMDAOs;
import org.alfresco.repo.avm.AVMNode;
import org.alfresco.repo.avm.MergeLink;
import org.alfresco.repo.avm.MergeLinkDAO;
import org.alfresco.repo.avm.MergeLinkImpl;
import org.alfresco.repo.domain.avm.AVMMergeLinkEntity;
/**
* iBATIS DAO wrapper for MergeLink
*
* @author janv
*/
class MergeLinkDAOIbatis implements MergeLinkDAO
{
/* (non-Javadoc)
* @see org.alfresco.repo.avm.MergeLinkDAO#save(org.alfresco.repo.avm.MergeLink)
*/
public void save(MergeLink link)
{
AVMDAOs.Instance().newAVMNodeLinksDAO.createMergeLink(link.getMfrom().getId(), link.getMto().getId());
}
/* (non-Javadoc)
* @see org.alfresco.repo.avm.MergeLinkDAO#getByTo(org.alfresco.repo.avm.AVMNode)
*/
public MergeLink getByTo(AVMNode to)
{
AVMMergeLinkEntity mlEntity = AVMDAOs.Instance().newAVMNodeLinksDAO.getMergeLinkByTo(to.getId());
if (mlEntity == null)
{
return null;
}
AVMNode from = AVMDAOs.Instance().fAVMNodeDAO.getByID(mlEntity.getMergeFromNodeId());
MergeLink ml = new MergeLinkImpl();
ml.setMfrom(from);
ml.setMto(to);
return ml;
}
/* (non-Javadoc)
* @see org.alfresco.repo.avm.MergeLinkDAO#getByFrom(org.alfresco.repo.avm.AVMNode)
*/
public List<MergeLink> getByFrom(AVMNode from)
{
List<AVMMergeLinkEntity> mlEntities = AVMDAOs.Instance().newAVMNodeLinksDAO.getMergeLinksByFrom(from.getId());
List<MergeLink> mls = new ArrayList<MergeLink>(mlEntities.size());
for (AVMMergeLinkEntity mlEntity : mlEntities)
{
AVMNode to = AVMDAOs.Instance().fAVMNodeDAO.getByID(mlEntity.getMergeToNodeId());
MergeLink ml = new MergeLinkImpl();
ml.setMfrom(from);
ml.setMto(to);
mls.add(ml);
}
return mls;
}
/* (non-Javadoc)
* @see org.alfresco.repo.avm.MergeLinkDAO#delete(org.alfresco.repo.avm.MergeLink)
*/
public void delete(MergeLink link)
{
AVMDAOs.Instance().newAVMNodeLinksDAO.deleteMergeLink(link.getMfrom().getId(), link.getMto().getId());
}
}

View File

@@ -856,6 +856,11 @@ abstract public class AbstractMappingMetadataExtracter implements MetadataExtrac
*/
protected Date makeDate(String dateStr)
{
if (dateStr == null || dateStr.length() == 0)
{
return null;
}
Date date = null;
try
{
@@ -885,9 +890,8 @@ abstract public class AbstractMappingMetadataExtracter implements MetadataExtrac
}
/**
* Adds a value to the map if it is non-trivial. A value is trivial if
* Adds a value to the map, conserving null values. Values are converted to null if:
* <ul>
* <li>it is null</li>
* <li>it is an empty string value after trimming</li>
* <li>it is an empty collection</li>
* <li>it is an empty array</li>
@@ -907,14 +911,14 @@ abstract public class AbstractMappingMetadataExtracter implements MetadataExtrac
{
if (value == null)
{
return false;
// Just keep this
}
if (value instanceof String)
else if (value instanceof String)
{
String valueStr = ((String) value).trim();
if (valueStr.length() == 0)
{
return false;
value = null;
}
else
{
@@ -927,14 +931,14 @@ abstract public class AbstractMappingMetadataExtracter implements MetadataExtrac
Collection valueCollection = (Collection) value;
if (valueCollection.isEmpty())
{
return false;
value = null;
}
}
else if (value.getClass().isArray())
{
if (Array.getLength(value) == 0)
{
return false;
value = null;
}
}
// It passed all the tests

View File

@@ -52,6 +52,7 @@ public interface MetadataExtracter extends ContentWorker
* <ul>
* <li>the extracted property is not null</li>
* </ul>
* <tt>null</tt> extracted values are returned in the 'modified' map.
*/
EAGER
{
@@ -64,11 +65,10 @@ public interface MetadataExtracter extends ContentWorker
QName propertyQName = entry.getKey();
Serializable extractedValue = entry.getValue();
// Ignore null extracted value
if (extractedValue == null)
if (extractedValue != null)
{
continue;
targetProperties.put(propertyQName, extractedValue);
}
targetProperties.put(propertyQName, extractedValue);
modifiedProperties.put(propertyQName, extractedValue);
}
return modifiedProperties;
@@ -82,6 +82,7 @@ public interface MetadataExtracter extends ContentWorker
* <li>the target value is null</li>
* <li>the string representation of the target value is an empty string</li>
* </ul>
* <tt>null</tt> extracted values are returned in the 'modified' map.
*/
PRAGMATIC
{
@@ -99,6 +100,7 @@ public interface MetadataExtracter extends ContentWorker
// Ignore null extracted value
if (extractedValue == null)
{
modifiedProperties.put(propertyQName, extractedValue);
continue;
}
// Handle the shortcut cases where the target value is missing or null
@@ -148,6 +150,7 @@ public interface MetadataExtracter extends ContentWorker
* <li>the extracted property is not null</li>
* <li>there is no target key for the property</li>
* </ul>
* <tt>null</tt> extracted values are returned in the 'modified' map.
*/
CAUTIOUS
{
@@ -162,6 +165,7 @@ public interface MetadataExtracter extends ContentWorker
// Ignore null extracted value
if (extractedValue == null)
{
modifiedProperties.put(propertyQName, extractedValue);
continue;
}
// Is the key present in the target values
@@ -181,8 +185,8 @@ public interface MetadataExtracter extends ContentWorker
* Apply the overwrite policy for the extracted properties.
*
* @return
* Returns a map of all properties that were applied to the target map. If the result is
* an empty map, then the target map remains unchanged.
* Returns a map of all properties that were applied to the target map
* as well as any null values that weren't applied but were present.
*/
public Map<QName, Serializable> applyProperties(Map<QName, Serializable> extractedProperties, Map<QName, Serializable> targetProperties)
{

View File

@@ -24,14 +24,16 @@ import org.alfresco.service.namespace.QName;
/**
* Contract disabling and enabling policy behaviours.
*
* @See org.alfresco.repo.policy.PolicyComponent
*
* @author David Caruana
*/
public interface BehaviourFilter
{
/**
* Disable behaviour for all nodes.
* Disable behaviour for a type or aspect for all nodes.
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @param className the type/aspect behaviour to disable
* @return true => already disabled
@@ -41,7 +43,7 @@ public interface BehaviourFilter
/**
* Disable behaviour for specific node
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @param nodeRef the node to disable for
* @param className the type/aspect behaviour to disable
@@ -52,7 +54,7 @@ public interface BehaviourFilter
/**
* Enable behaviour for all nodes
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @param className the type/aspect behaviour to enable
*/
@@ -61,7 +63,7 @@ public interface BehaviourFilter
/**
* Enable behaviour for specific node
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @param nodeRef the node to enable for
* @param className the type/aspect behaviour to enable
@@ -71,24 +73,37 @@ public interface BehaviourFilter
/**
* Enable all behaviours for specific node
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @param nodeRef the node to enable for
*/
public void enableBehaviours(NodeRef nodeRef);
/**
* Enable all behaviours i.e. undo all disable calls - both at the
* Disable all behaviours. Once this method is called the node and class level filters, enableBehaviours and disableBehaviours
* methods have no effect, every behaviour is disabled.
* EnableAllBehaviours reverses the result of calling this method.
* <p>
* Calling this method may result in nodes existing in your repository that do not conform to your policies.
*
* <p>
* The change applies <b>ONLY</b> to the current transaction.
* @see #enableAllBehaviours
*/
public void disableAllBehaviours();
/**
* Enable all behaviours i.e. undo all disable calls - at the global,
* node and class level.
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*/
public void enableAllBehaviours();
/**
* Determine if behaviour is enabled across all nodes.
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @param className the behaviour to test for
* @return true => behaviour is enabled
@@ -102,7 +117,7 @@ public interface BehaviourFilter
* a) the behaviour is not disabled across all nodes
* b) the behaviour is not disabled specifically for the provided node
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @param nodeRef the node to test for
* @param className the behaviour to test for
@@ -113,7 +128,7 @@ public interface BehaviourFilter
/**
* Determine if any behaviours have been disabled?
* <p>
* The change applies <b>ONLY</b> to the current trasaction.
* The change applies <b>ONLY</b> to the current transaction.
*
* @return true => behaviours have been filtered
*/

View File

@@ -37,6 +37,7 @@ import org.alfresco.service.namespace.QName;
*/
public class BehaviourFilterImpl implements BehaviourFilter
{
private static final String KEY_GLOBAL_FILTER = "BehaviourFilterImpl.gloalFilter";
private static final String KEY_CLASS_FILTER = "BehaviourFilterImpl.classFilter";
private static final String KEY_NODEREF_FILTER = "BehaviourFilterImpl.nodeRefFilter";
@@ -122,14 +123,26 @@ public class BehaviourFilterImpl implements BehaviourFilter
nodeRefFilters.remove(nodeRef);
}
public void disableAllBehaviours()
{
TransactionalResourceHelper.setBoolean(KEY_GLOBAL_FILTER);
}
public void enableAllBehaviours()
{
TransactionalResourceHelper.resetBoolean(KEY_GLOBAL_FILTER);
Map<NodeRef,List<QName>> filters = TransactionalResourceHelper.getMap(KEY_NODEREF_FILTER);
filters.clear();
}
public boolean isEnabled(NodeRef nodeRef, QName className)
{
if(TransactionalResourceHelper.testBoolean(KEY_GLOBAL_FILTER))
{
return false;
}
// check global filters
if (!isEnabled(className))
{
@@ -163,6 +176,11 @@ public class BehaviourFilterImpl implements BehaviourFilter
public boolean isEnabled(QName className)
{
if(TransactionalResourceHelper.testBoolean(KEY_GLOBAL_FILTER))
{
return false;
}
// check global class filters
List<QName> classFilters = TransactionalResourceHelper.getList(KEY_CLASS_FILTER);
boolean filtered = classFilters.contains(className);
@@ -186,6 +204,8 @@ public class BehaviourFilterImpl implements BehaviourFilter
{
List<QName> classFilters = TransactionalResourceHelper.getList(KEY_CLASS_FILTER);
Map<NodeRef,List<QName>> nodeRefFilters = TransactionalResourceHelper.getMap(KEY_NODEREF_FILTER);
return (!classFilters.isEmpty()) || (!nodeRefFilters.isEmpty());
boolean globalFlag = TransactionalResourceHelper.testBoolean(KEY_GLOBAL_FILTER);
return ((!classFilters.isEmpty()) || (!nodeRefFilters.isEmpty()) || globalFlag);
}
}

View File

@@ -38,6 +38,9 @@ import org.alfresco.service.namespace.QName;
* this case, the behaviour is not validated (i.e. checked to determine if it
* supports the policy interface) until the Policy is registered. Otherwise,
* the behaviour is validated at bind-time.
*
* @see org.alfresco.repo.policy.BehaviourFilter
*
*
* @author David Caruana
*

View File

@@ -35,6 +35,8 @@ import org.alfresco.service.cmr.rule.RuleService;
import org.alfresco.service.namespace.NamespaceService;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.PropertyCheck;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
/**
* Class containing behaviour for the rules aspect
@@ -50,6 +52,8 @@ public class RulesAspect implements
private RuleService ruleService;
private NodeService nodeService;
private static Log logger = LogFactory.getLog(RulesAspect.class);
public void setPolicyComponent(PolicyComponent policyComponent)
{
this.policyComponent = policyComponent;
@@ -85,7 +89,6 @@ public class RulesAspect implements
QName.createQName(NamespaceService.ALFRESCO_URI, "onCopyComplete"),
RuleModel.ASPECT_RULES,
new JavaBehaviour(this, "onCopyComplete"));
this.policyComponent.bindClassBehaviour(
QName.createQName(NamespaceService.ALFRESCO_URI, "onAddAspect"),
RuleModel.ASPECT_RULES,
@@ -103,6 +106,10 @@ public class RulesAspect implements
int count = this.nodeService.getChildAssocs(nodeRef, RuleModel.ASSOC_RULE_FOLDER, RuleModel.ASSOC_RULE_FOLDER).size();
if (count == 0)
{
if(logger.isDebugEnabled())
{
logger.debug("rules folder does not exist: create new rules folder for: " + nodeRef);
}
this.nodeService.createNode(
nodeRef,
RuleModel.ASSOC_RULE_FOLDER,

View File

@@ -49,7 +49,8 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
{
private static final Log logger = LogFactory.getLog(LDAPInitialDirContextFactoryImpl.class);
private Map<String, String> initialDirContextEnvironment = Collections.<String, String> emptyMap();
private Map<String, String> defaultEnvironment = Collections.<String, String> emptyMap();
private Map<String, String> authenticatedEnvironment = Collections.<String, String> emptyMap();
static
{
@@ -63,13 +64,18 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
public void setInitialDirContextEnvironment(Map<String, String> initialDirContextEnvironment)
{
this.initialDirContextEnvironment = initialDirContextEnvironment;
this.authenticatedEnvironment = initialDirContextEnvironment;
}
public Map<String, String> getInitialDirContextEnvironment()
{
return initialDirContextEnvironment;
return authenticatedEnvironment;
}
public void setDefaultIntialDirContextEnvironment(Map<String, String> defaultEnvironment)
{
this.defaultEnvironment = defaultEnvironment;
}
public InitialDirContext getDefaultIntialDirContext() throws AuthenticationException
{
@@ -78,10 +84,8 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
public InitialDirContext getDefaultIntialDirContext(int pageSize) throws AuthenticationException
{
Hashtable<String, String> env = new Hashtable<String, String>(initialDirContextEnvironment.size());
env.putAll(initialDirContextEnvironment);
env.put("javax.security.auth.useSubjectCredsOnly", "false");
env.put("com.sun.jndi.ldap.connect.pool", "true"); // Pool the default connection
Hashtable<String, String> env = new Hashtable<String, String>(defaultEnvironment.size());
env.putAll(defaultEnvironment);
return buildInitialDirContext(env, pageSize);
}
@@ -185,8 +189,8 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
throw new AuthenticationException("Empty credentials provided.");
}
Hashtable<String, String> env = new Hashtable<String, String>(initialDirContextEnvironment.size());
env.putAll(initialDirContextEnvironment);
Hashtable<String, String> env = new Hashtable<String, String>(authenticatedEnvironment.size());
env.putAll(authenticatedEnvironment);
env.put(Context.SECURITY_PRINCIPAL, principal);
env.put(Context.SECURITY_CREDENTIALS, credentials);
@@ -284,8 +288,8 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
{
// Check Anonymous bind
Hashtable<String, String> env = new Hashtable<String, String>(initialDirContextEnvironment.size());
env.putAll(initialDirContextEnvironment);
Hashtable<String, String> env = new Hashtable<String, String>(authenticatedEnvironment.size());
env.putAll(authenticatedEnvironment);
env.remove(Context.SECURITY_PRINCIPAL);
env.remove(Context.SECURITY_CREDENTIALS);
try
@@ -310,8 +314,8 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
// Simple DN and password
env = new Hashtable<String, String>(initialDirContextEnvironment.size());
env.putAll(initialDirContextEnvironment);
env = new Hashtable<String, String>(authenticatedEnvironment.size());
env.putAll(authenticatedEnvironment);
env.put(Context.SECURITY_PRINCIPAL, "daftAsABrush");
env.put(Context.SECURITY_CREDENTIALS, "daftAsABrush");
try
@@ -339,8 +343,8 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
// DN and password
env = new Hashtable<String, String>(initialDirContextEnvironment.size());
env.putAll(initialDirContextEnvironment);
env = new Hashtable<String, String>(authenticatedEnvironment.size());
env.putAll(authenticatedEnvironment);
env.put(Context.SECURITY_PRINCIPAL, "cn=daftAsABrush,dc=woof");
env.put(Context.SECURITY_CREDENTIALS, "daftAsABrush");
try
@@ -368,14 +372,14 @@ public class LDAPInitialDirContextFactoryImpl implements LDAPInitialDirContextFa
// Check more if we have a real principal we expect to work
env = new Hashtable<String, String>(initialDirContextEnvironment.size());
env.putAll(initialDirContextEnvironment);
env = new Hashtable<String, String>(defaultEnvironment.size());
env.putAll(defaultEnvironment);
if (env.get(Context.SECURITY_PRINCIPAL) != null)
{
// Correct principal invalid password
env = new Hashtable<String, String>(initialDirContextEnvironment.size());
env.putAll(initialDirContextEnvironment);
env = new Hashtable<String, String>(defaultEnvironment.size());
env.putAll(defaultEnvironment);
env.put(Context.SECURITY_CREDENTIALS, "sdasdasdasdasd123123123");
try
{

View File

@@ -34,6 +34,9 @@ import java.util.Map;
import java.util.Set;
import java.util.TreeMap;
import java.util.TreeSet;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import org.alfresco.model.ContentModel;
import org.alfresco.repo.batch.BatchProcessor;
@@ -58,6 +61,7 @@ import org.alfresco.service.namespace.NamespaceService;
import org.alfresco.service.namespace.QName;
import org.alfresco.service.transaction.TransactionService;
import org.alfresco.util.PropertyMap;
import org.alfresco.util.TraceableThreadFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.NoSuchBeanDefinitionException;
@@ -103,8 +107,8 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
private static final QName LOCK_QNAME = QName.createQName(NamespaceService.SYSTEM_MODEL_1_0_URI,
"ChainingUserRegistrySynchronizer");
/** The maximum time this lock will be held for (1 day). */
private static final long LOCK_TTL = 1000 * 60 * 60 * 24;
/** The time this lock will persist in the database (now only 2 minutes, but refreshed at regular intervals). */
private static final long LOCK_TTL = 1000 * 60 * 2;
/** The path in the attribute service below which we persist attributes. */
private static final String ROOT_ATTRIBUTE_PATH = ".ChainingUserRegistrySynchronizer";
@@ -315,7 +319,7 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
* (non-Javadoc)
* @see org.alfresco.repo.security.sync.UserRegistrySynchronizer#synchronize(boolean, boolean, boolean)
*/
public void synchronize(boolean forceUpdate, boolean allowDeletions, boolean splitTxns)
public void synchronize(boolean forceUpdate, boolean allowDeletions, final boolean splitTxns)
{
// Don't proceed with the sync if the repository is read only
if (this.transactionService.isReadOnly())
@@ -325,8 +329,15 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
return;
}
// Create a background executor that will refresh our lock. This means we can request a lock with a relatively
// small persistence time and not worry about it lasting after server restarts. Note we use an independent
// executor because this is a compound operation that spans across multiple batch processors.
String lockToken = null;
TraceableThreadFactory threadFactory = new TraceableThreadFactory();
threadFactory.setNamePrefix("ChainingUserRegistrySynchronizer lock refresh");
threadFactory.setThreadDaemon(true);
ScheduledExecutorService lockRefresher = new ScheduledThreadPoolExecutor(1, threadFactory);
// Let's ensure all exceptions get logged
try
{
@@ -351,7 +362,7 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
else
{
// If this is a login-triggered sync, give it a few retries before giving up
this.jobLockService.getTransactionalLock(ChainingUserRegistrySynchronizer.LOCK_QNAME,
lockToken = this.jobLockService.getLock(ChainingUserRegistrySynchronizer.LOCK_QNAME,
ChainingUserRegistrySynchronizer.LOCK_TTL, 3000, 10);
}
}
@@ -363,6 +374,27 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
return;
}
// Schedule the lock refresh to run at regular intervals
final String token = lockToken;
lockRefresher.scheduleAtFixedRate(new Runnable()
{
public void run()
{
ChainingUserRegistrySynchronizer.this.transactionService.getRetryingTransactionHelper()
.doInTransaction(new RetryingTransactionCallback<Object>()
{
public Object execute() throws Throwable
{
ChainingUserRegistrySynchronizer.this.jobLockService.refreshLock(token,
ChainingUserRegistrySynchronizer.LOCK_QNAME,
ChainingUserRegistrySynchronizer.LOCK_TTL);
return null;
}
}, false, splitTxns);
}
}, ChainingUserRegistrySynchronizer.LOCK_TTL / 2, ChainingUserRegistrySynchronizer.LOCK_TTL / 2,
TimeUnit.MILLISECONDS);
Set<String> visitedZoneIds = new TreeSet<String>();
Collection<String> instanceIds = this.applicationContextManager.getInstanceIds();
@@ -418,6 +450,16 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
{
if (lockToken != null)
{
// Cancel the lock refresher
lockRefresher.shutdown();
try
{
lockRefresher.awaitTermination(Long.MAX_VALUE, TimeUnit.SECONDS);
}
catch (InterruptedException e)
{
}
final String token = lockToken;
this.transactionService.getRetryingTransactionHelper().doInTransaction(
new RetryingTransactionCallback<Object>()
@@ -888,8 +930,43 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
// Remove all the associations we have already dealt with
this.groupAssocsToCreate.keySet().removeAll(this.authoritiesMaintained);
// Filter out associations to authorities that simply can't exist
this.groupAssocsToCreate.keySet().retainAll(this.allZoneAuthorities);
// Filter out associations to authorities that simply can't exist (and log if debugging is enabled)
Iterator<Map.Entry<String, Set<String>>> i = this.groupAssocsToCreate.entrySet().iterator();
StringBuilder groupList = null;
while (i.hasNext())
{
Map.Entry<String, Set<String>> entry = i.next();
String child = entry.getKey();
if (!this.allZoneAuthorities.contains(child))
{
if (ChainingUserRegistrySynchronizer.logger.isDebugEnabled())
{
if (groupList == null)
{
groupList = new StringBuilder(1024);
}
else
{
groupList.setLength(0);
}
for (String parent : entry.getValue())
{
if (groupList.length() > 0)
{
groupList.append(", ");
}
groupList.append('\'').append(
ChainingUserRegistrySynchronizer.this.authorityService.getShortName(parent))
.append('\'');
}
ChainingUserRegistrySynchronizer.logger.debug("Ignoring non-existent member '"
+ ChainingUserRegistrySynchronizer.this.authorityService.getShortName(child)
+ "' in groups {" + groupList.toString() + "}");
}
i.remove();
}
}
if (!this.groupAssocsToCreate.isEmpty())
{

View File

@@ -107,4 +107,52 @@ public abstract class TransactionalResourceHelper
}
return list;
}
/**
* Support method to set a boolean (true) value in the current transaction.
* @param resourceKey the key under which the resource will be stored
* @return true if the value was newly set to true, false if it was already set
*/
public static final boolean setBoolean(Object resourceKey)
{
Boolean value = AlfrescoTransactionSupport.getResource(resourceKey);
if(value == null)
{
AlfrescoTransactionSupport.bindResource(resourceKey, Boolean.TRUE);
return true;
}
return false;
}
/**
* Support method to reset (clear) a boolean value in the current transaction.
* @param resourceKey the key under which the resource is stored.
*/
public static final void resetBoolean(Object resourceKey)
{
Boolean value = AlfrescoTransactionSupport.getResource(resourceKey);
if(value != null)
{
AlfrescoTransactionSupport.unbindResource(resourceKey);
}
}
/**
* Is there a boolean value set in the current transaction?
* @param resourceKey the key under which the resource is stored
* @return true if there is, false otherwise
*/
public static final boolean testBoolean(Object resourceKey)
{
Boolean value = AlfrescoTransactionSupport.getResource(resourceKey);
if(value == null)
{
return false;
}
else
{
return true;
}
}
}
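Taken together, setBoolean, testBoolean and resetBoolean give a simple transaction-scoped flag, which is how BehaviourFilterImpl uses them above for its global filter. A minimal sketch, assuming a caller-defined key; the key name and method names below are illustrative placeholders, not part of this change.

// Hypothetical transaction-scoped flag; KEY is whatever the caller uses to identify it.
private static final String KEY = "example.globalFlag";

void markDisabled()
{
    // Binds Boolean.TRUE for the current transaction; returns false if it was already set.
    TransactionalResourceHelper.setBoolean(KEY);
}

boolean isMarkedDisabled()
{
    return TransactionalResourceHelper.testBoolean(KEY);
}

void clearDisabled()
{
    // Unbinds the flag so testBoolean() returns false again.
    TransactionalResourceHelper.resetBoolean(KEY);
}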

View File

@@ -26,17 +26,30 @@ import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.alfresco.model.ContentModel;
import org.alfresco.service.cmr.dictionary.AssociationDefinition;
import org.alfresco.service.cmr.dictionary.DictionaryService;
import org.alfresco.service.cmr.repository.ChildAssociationRef;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeService;
import org.alfresco.service.cmr.transfer.NodeCrawler;
import org.alfresco.service.namespace.QName;
import org.alfresco.service.namespace.RegexQNamePattern;
/**
* @author brian
*
* A node finder that searches for child nodes with the association specified.
*
* For example, could be used to find all children with the cm:contains relationship.
*
* <pre>
* NodeCrawler crawler = nodeCrawlerFactory.getNodeCrawler();
* crawler.setNodeFinders(new ChildAssociatedNodeFinder(ContentModel.ASSOC_CONTAINS));
* Set<NodeRef> crawledNodes = crawler.crawl(rootNode);
* </pre>
* @see NodeCrawlerFactory
*
*/
public class ChildAssociatedNodeFinder extends AbstractNodeFinder
{

View File

@@ -36,6 +36,7 @@ import javax.xml.parsers.SAXParserFactory;
import org.alfresco.model.ContentModel;
import org.alfresco.repo.policy.BehaviourFilter;
import org.alfresco.repo.rule.RuleModel;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.alfresco.repo.security.authentication.AuthenticationUtil.RunAsWork;
import org.alfresco.repo.tenant.TenantService;
@@ -51,6 +52,7 @@ import org.alfresco.service.cmr.repository.DuplicateChildNodeNameException;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeService;
import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.cmr.rule.RuleService;
import org.alfresco.service.cmr.search.ResultSet;
import org.alfresco.service.cmr.search.SearchService;
import org.alfresco.service.cmr.transfer.TransferException;
@@ -142,6 +144,7 @@ public class RepoTransferReceiverImpl implements TransferReceiver
private TransferProgressMonitor progressMonitor;
private ActionService actionService;
private TenantService tenantService;
private RuleService ruleService;
private Map<String,NodeRef> transferLockFolderMap = new ConcurrentHashMap<String, NodeRef>();
private Map<String,NodeRef> transferTempFolderMap = new ConcurrentHashMap<String, NodeRef>();
@@ -151,6 +154,10 @@ public class RepoTransferReceiverImpl implements TransferReceiver
{
PropertyCheck.mandatory(this, "nodeService", nodeService);
PropertyCheck.mandatory(this, "searchService", searchService);
PropertyCheck.mandatory(this, "ruleService", ruleService);
PropertyCheck.mandatory(this, "actionService", actionService);
PropertyCheck.mandatory(this, "behaviourFilter", behaviourFilter);
PropertyCheck.mandatory(this, "tennantService", tenantService);
PropertyCheck.mandatory(this, "transactionService", transactionService);
PropertyCheck.mandatory(this, "transferLockFolderPath", transferLockFolderPath);
PropertyCheck.mandatory(this, "inboundTransferRecordsPath", inboundTransferRecordsPath);
@@ -587,6 +594,13 @@ public class RepoTransferReceiverImpl implements TransferReceiver
{
log.debug("Committing transferId=" + transferId);
}
/**
* Turn off rules while transfer is being committed.
*/
boolean rulesEnabled = ruleService.isEnabled();
ruleService.disableRules();
try
{
nudgeLock(transferId);
@@ -616,14 +630,18 @@ public class RepoTransferReceiverImpl implements TransferReceiver
for (TransferManifestProcessor processor : commitProcessors)
{
XMLTransferManifestReader reader = new XMLTransferManifestReader(processor);
behaviourFilter.disableBehaviour(ContentModel.ASPECT_AUDITABLE);
//behaviourFilter.disableBehaviour(ContentModel.ASPECT_AUDITABLE);
behaviourFilter.disableAllBehaviours();
try
{
parser.parse(snapshotFile, reader);
}
finally
{
behaviourFilter.enableBehaviour(ContentModel.ASPECT_AUDITABLE);
// behaviourFilter.enableBehaviour(ContentModel.ASPECT_AUDITABLE);
behaviourFilter.enableAllBehaviours();
}
nudgeLock(transferId);
parser.reset();
@@ -674,6 +692,14 @@ public class RepoTransferReceiverImpl implements TransferReceiver
}
finally
{
if(rulesEnabled)
{
/**
* Turn rules back on if we turned them off earlier.
*/
ruleService.enableRules();
}
/**
* Clean up at the end of the transfer
*/
@@ -805,5 +831,19 @@ public class RepoTransferReceiverImpl implements TransferReceiver
{
this.actionService = actionService;
}
/**
* @param ruleService
* the ruleService to set
*/
public void setRuleService(RuleService ruleService)
{
this.ruleService = ruleService;
}
public RuleService getRuleService()
{
return this.ruleService;
}
}

View File

@@ -39,6 +39,8 @@ import org.alfresco.service.cmr.repository.ContentData;
import org.alfresco.service.cmr.repository.MLText;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.Path;
import org.alfresco.service.cmr.repository.datatype.DefaultTypeConverter;
import org.alfresco.service.cmr.repository.datatype.TypeConversionException;
import org.alfresco.service.cmr.transfer.TransferException;
import org.alfresco.service.namespace.NamespaceException;
import org.alfresco.service.namespace.NamespacePrefixResolver;
@@ -270,6 +272,7 @@ public class XMLTransferManifestReader extends DefaultHandler implements Content
}
else if(elementName.equals(ManifestModel.LOCALNAME_ELEMENT_VALUE_STRING))
{
props.put("className", atts.getValue("", "className"));
buffer = new StringBuffer();
}
else if(elementName.equals(ManifestModel.LOCALNAME_ELEMENT_VALUE_NULL))
@@ -458,7 +461,26 @@ public class XMLTransferManifestReader extends DefaultHandler implements Content
else if(elementName.equals(ManifestModel.LOCALNAME_ELEMENT_VALUE_STRING))
{
Collection<Serializable> values = (Collection<Serializable>)props.get("values");
String value = buffer.toString();
String className = (String)props.get("className");
Serializable value = buffer.toString();
if(className != null && !className.equals("java.lang.String"))
{
// value is not a string and needs to be converted
try
{
value = (Serializable)DefaultTypeConverter.INSTANCE.convert(Class.forName(className), value);
}
catch (TypeConversionException tcf)
{
// leave value as string
}
catch (ClassNotFoundException cnf)
{
// leave value as string
}
}
if(values != null)
{

View File

@@ -255,12 +255,12 @@ public class XMLTransferManifestWriter implements TransferManifestWriter
}
@SuppressWarnings("unchecked")
private void writeProperty(QName name, Serializable value) throws SAXException
private void writeProperty(QName propertyName, Serializable value) throws SAXException
{
AttributesImpl attributes = new AttributesImpl();
attributes.addAttribute(TransferModel.TRANSFER_MODEL_1_0_URI, "name", "name", "String",
formatQName(name));
formatQName(propertyName));
writer.startElement(TransferModel.TRANSFER_MODEL_1_0_URI,
ManifestModel.LOCALNAME_ELEMENT_PROPERTY, PREFIX + ":"
+ ManifestModel.LOCALNAME_ELEMENT_PROPERTY, attributes);
@@ -328,11 +328,16 @@ public class XMLTransferManifestWriter implements TransferManifestWriter
{
try
{
AttributesImpl valueAttributes = new AttributesImpl();
valueAttributes.addAttribute(TransferModel.TRANSFER_MODEL_1_0_URI, "className",
"className", "String", value.getClass().getName());
String strValue = (String) DefaultTypeConverter.INSTANCE.convert(String.class, value);
writer.startElement(TransferModel.TRANSFER_MODEL_1_0_URI,
ManifestModel.LOCALNAME_ELEMENT_VALUE_STRING, PREFIX + ":"
+ ManifestModel.LOCALNAME_ELEMENT_VALUE_STRING,
EMPTY_ATTRIBUTES);
valueAttributes);
writer.characters(strValue.toCharArray(), 0, strValue.length());

View File

@@ -52,7 +52,7 @@ import org.springframework.extensions.surf.util.ISO8601DateFormat;
/**
* Support for generic conversion between types.
*
* Additional conversions may be added. Basic interoperabikitynos supported.
* Additional conversions may be added. Basic inter-operability supported.
*
* Direct conversion and two stage conversions via Number are supported. We do
* not support conversion by any route at the moment
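This converter is what the manifest reader above uses to rebuild non-string property values via DefaultTypeConverter.INSTANCE. Two illustrative conversions follow; the values and target types are arbitrary examples, and the date case assumes ISO8601 text is accepted, as the ISO8601DateFormat import above suggests.

// Direct conversions from String; example values only, not taken from this change.
Integer count = DefaultTypeConverter.INSTANCE.convert(Integer.class, "42");
Date when = DefaultTypeConverter.INSTANCE.convert(Date.class, "2010-06-14T00:00:00.000Z"); // ISO8601 text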

View File

@@ -52,20 +52,28 @@ public interface RuleService
/**
* Enable rules for the current thread
*
* @see #isEnabled
* @see #disableRules
*/
@Auditable
public void enableRules();
/**
* Diable rules for the current thread
* Disable rules for the current thread
* @see #enableRules
* @see #isEnabled
*/
@Auditable
public void disableRules();
/**
* Indicates whether rules are currently enabled or not
* Indicates whether rules are currently enabled for the current thread or not
*
* @return true if rules are enabled, false otherwise
* @see #enableRules
* @see #disableRules
*
* @return true if rules are enabled for the current thread, false otherwise
*/
@Auditable
public boolean isEnabled();
@@ -101,6 +109,8 @@ public interface RuleService
/**
* Disables a rule, preventing it from being fired.
*
* @see #enableRule
*
* @param rule the rule to disable
*/
@Auditable(parameters = {"rule"})
@@ -109,6 +119,8 @@ public interface RuleService
/**
* Enables a rule previously disabled.
*
* @see #disableRule
*
* @param rule the rule to enable
*/
@Auditable(parameters = {"rule"})
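Because the enable/disable state is per-thread, a caller that disables rules is expected to restore the previous state itself, normally in a finally block; this is the pattern the RepoTransferReceiverImpl change above adopts. A minimal sketch, where doWork() is a placeholder operation:

// Remember whether rules were enabled before changing the setting for this thread.
boolean rulesEnabled = ruleService.isEnabled();
ruleService.disableRules();
try
{
    doWork(); // hypothetical operation that must not trigger rules
}
finally
{
    if (rulesEnabled)
    {
        // Only re-enable if rules were on when we started.
        ruleService.enableRules();
    }
}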

View File

@@ -21,5 +21,12 @@ package org.alfresco.service.cmr.transfer;
public interface NodeCrawlerFactory
{
/**
* Get a node crawler from the node crawler factory.
*
* A new instance of a node crawler is returned each time this method is called.
*
* @return a new node crawler.
*/
NodeCrawler getNodeCrawler();
}

View File

@@ -0,0 +1,102 @@
/*
* Copyright (C) 2005-2010 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.util.remote.server.socket;
import java.io.IOException;
import java.io.Serializable;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.UnknownHostException;
import java.rmi.server.RMIClientSocketFactory;
import java.rmi.server.RMIServerSocketFactory;
import java.util.Properties;
import org.springframework.beans.factory.InitializingBean;
/**
* This <i><b>Spring</b>-dependent</i> class controls the binding of an RMI registry to a specific port and local host, e.g. <code>localhost</code>,
* <code>192.168.0.1</code>, etc. The host may be configured with the <code>-Djava.rmi.server.hostname</code> system property.<br />
* <br />
* <i><b>NOTE:</b> The system property configuration has the highest priority</i>
*
* @author Dmitry Velichkevich
* @see InitializingBean <b>Spring</b> dependence
* @see RMIServerSocketFactory
* @see RMIClientSocketFactory
*/
public class HostConfigurableSocketFactory implements RMIServerSocketFactory, RMIClientSocketFactory, InitializingBean, Serializable
{
private static final long serialVersionUID = 4115227360496369889L;
private static final String SERVER_HOSTNAME_PROPERTY = "java.rmi.server.hostname";
private InetAddress host;
public void setHost(String host)
{
try
{
this.host = InetAddress.getByName(host);
}
catch (UnknownHostException e)
{
throw new RuntimeException(e.toString());
}
}
public void setHost(InetAddress host)
{
this.host = host;
}
/**
* @return a {@link String} value representing either the <i>Host Name</i> or, if the host name cannot be resolved, the <i>Host (IP) Address</i>
*/
public String getHost()
{
if (null != host.getHostName())
{
return host.getHostName();
}
return host.getHostAddress();
}
public Socket createSocket(String host, int port) throws IOException
{
return new Socket(this.host, port);
}
public ServerSocket createServerSocket(int port) throws IOException
{
return new ServerSocket(port, 0, host);
}
/**
* Checks whether the -Djava.rmi.server.hostname system property is present and, if so, sets the host from it
*/
public void afterPropertiesSet() throws Exception
{
Properties properties = System.getProperties();
if (properties.containsKey(SERVER_HOSTNAME_PROPERTY))
{
setHost(properties.getProperty(SERVER_HOSTNAME_PROPERTY));
}
}
}
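Since the class implements both RMI socket factory interfaces, one plausible use is to pass the same instance as client and server factory when exporting an RMI registry. The sketch below is illustrative only: the address and port are examples rather than Alfresco configuration, exception handling is omitted, and it requires java.rmi.registry.LocateRegistry and java.rmi.registry.Registry.

HostConfigurableSocketFactory factory = new HostConfigurableSocketFactory();
factory.setHost("192.168.0.1");   // example interface to bind to
factory.afterPropertiesSet();     // lets -Djava.rmi.server.hostname override the value, as documented above
Registry registry = LocateRegistry.createRegistry(50500, factory, factory);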

View File

@@ -70,7 +70,7 @@ public class AbstractWCMServiceImplTest extends TestCase
protected static final boolean CLEAN = true; // cleanup during teardown
// base web project
protected static final String TEST_WEBPROJ_DNS = "testWebProj-"+TEST_RUN;
protected static final String TEST_WEBPROJ_DNS = "testWP-"+TEST_RUN;
protected static final String TEST_WEBPROJ_NAME = "testSandbox Web Project Display Name - "+TEST_RUN;
protected static final String TEST_WEBPROJ_TITLE = "This is my title";

View File

@@ -180,9 +180,9 @@ public class WebProjectServiceImplTest extends AbstractWCMServiceImplTest
}
// Mangled case
String dnsName = TEST_WEBPROJ_DNS+"some.unexpected.chars";
String dnsName = TEST_WEBPROJ_DNS+"-a.b.c";
String name = dnsName + " name";
String mangledDnsName = TEST_WEBPROJ_DNS+"some-unexpected-chars";
String mangledDnsName = TEST_WEBPROJ_DNS+"-a-b-c";
wpInfo = wpService.createWebProject(dnsName, name, TEST_WEBPROJ_TITLE, TEST_WEBPROJ_DESCRIPTION, TEST_WEBPROJ_DEFAULT_WEBAPP, TEST_WEBPROJ_USE_AS_TEMPLATE, null);
checkWebProjectInfo(wpInfo, mangledDnsName, name, TEST_WEBPROJ_TITLE, TEST_WEBPROJ_DESCRIPTION, TEST_WEBPROJ_DEFAULT_WEBAPP, TEST_WEBPROJ_USE_AS_TEMPLATE);
@@ -190,8 +190,8 @@ public class WebProjectServiceImplTest extends AbstractWCMServiceImplTest
checkWebProjectInfo(wpInfo, mangledDnsName, name, TEST_WEBPROJ_TITLE, TEST_WEBPROJ_DESCRIPTION, TEST_WEBPROJ_DEFAULT_WEBAPP, TEST_WEBPROJ_USE_AS_TEMPLATE);
// Another mangled case
dnsName = TEST_WEBPROJ_DNS+"some.moreé1í2ó3ú4Á5É6Í7Ó8Ú9";
mangledDnsName = TEST_WEBPROJ_DNS+"some-more-1-2-3-4-5-6-7-8-9";
dnsName = TEST_WEBPROJ_DNS+"-0é1í2ó3ú4";
mangledDnsName = TEST_WEBPROJ_DNS+"-0-1-2-3-4";
name = dnsName + " name";