Merged V4.0-BUG-FIX to HEAD

37207: BDE-69: Filter more tests for minimal build plan
   37253: Fix for ALF-13634 Re-created category won't show up again on a node in Document Library.
   - also fixes deletion of secondary associations
   37265: Merged V4.0 to V4.0-BUG-FIX
      37224: ALF-14174: Part 14 for ALF-14237 Upgrades from 4.0.0.x/4.0.1.0 will not fix the timestamps on acl changesets - SOLR will skip ACLs set prior to upgrade
      - Fix syntax error on Oracle
      37250: Fix for ALF-14174 The patch adding timestamps to acl_change_set breaks SOLR tracking
      - better cross DB fix
   37298:  ALF-14365 - added hazelcastConfig.xml.sample
   37323: ALF-13247: Two nodes with the same primary path. 
      - Fixed by initializing the zone before parallel batch processing begins.
   37326: ALF-13933 Alfresco needs to be able to support LibreOffice for transformations
   ALF-13452 Open office startup from Java not working on OSX
      - Added code to start LibreOffice 3.5 on Mac (it requires different options on the command line, and
        ure-link is a directory rather than a file on Mac)
      - Removes $DYLD_LIBRARY_PATH from the environment when starting either OpenOffice or LibreOffice on Mac,
        so we no longer need to rely on the installer moving the soffice.bin process to .soffice.bin and then
        creating a soffice.bin shell script that removes $DYLD_LIBRARY_PATH
      - Indent TransformerDebug a bit more now that we have failover transformers at the top and lower levels
        (saves N.N.N.N.N.N getting mixed up with text)
   37340: Merged V3.4-BUG-FIX (3.4.10) to V4.0-BUG-FIX (4.0.3) RECORD ONLY
      37339: ALF-13452: Merged V4.0-BUG-FIX (4.0.3) to V3.4-BUG-FIX (3.4.10)
         37326: ALF-13933 Alfresco needs to be able to support LibreOffice for transformations
         ALF-13452 Open office startup from Java not working on OSX
             - Added code to start LibreOffice 3.5 on Mac (it requires different options on the command line, and
               ure-link is a directory rather than a file on Mac)
             - Removes $DYLD_LIBRARY_PATH from the environment when starting either OpenOffice or LibreOffice on Mac,
               so we no longer need to rely on the installer moving the soffice.bin process to .soffice.bin and then
               creating a soffice.bin shell script that removes $DYLD_LIBRARY_PATH
             - Indent TransformerDebug a bit more now that we have failover transformers at the top and lower levels
               (saves N.N.N.N.N.N getting mixed up with text)
         36273: ALF-13933 Alfresco needs to be able to support LibreOffice for transformations
            - Return a dummy OpenOffice command even when there is no OpenOffice/LibreOffice installed or on the path. 
         36264: ALF-13933 Alfresco needs to be able to support LibreOffice for transformations
            - remove old jodconverter-core-3.0-beta-3.diff
         36259: ALF-13933 Alfresco needs to be able to support LibreOffice for transformations
            << Developed on Windows 7. Might need more work on Linux to get LibreOffice to shut down, but should be
               okay with OpenOffice 3.2 which was used in the previous release. >> 
             - Updated jodconverter to the latest version, jodconverter-core-3.0-SNAPSHOT-patched.jar (28/4/2012), which is newer
               than 3.0-beta-4
            - Applied patch for http://code.google.com/p/jodconverter/issues/detail?id=103 to handle setting the env
              for LibreOffice 3.5
            - Modified code to use partial GNU style options (not used for -env!) when using LibreOffice
            - Added OpenOfficeCommandLine to dynamically supply OpenOffice or LibreOffice command line args for OOoDirect
            - Tested to work with OpenOffice 3.4 and 3.2 on Windows 7
   37353: Merged V3.4-BUG-FIX (3.4.10) to V4.0-BUG-FIX (4.0.3)
      37352: ALF-13452, ALF-13933 Alfresco needs to be able to support LibreOffice for transformations
         - Build test failure
   37359: New JUnit Rule to support automatic creation and cleanup of Share sites in test code.
   This is required for an imminent fix to ALF-14345, but I'm checking it in separately in order to merge this general utility.
   37360: Fix for ALF-14345. Site Service list method does not recognise sub-types of st:site.
   37364: Merged V3.4-BUG-FIX (3.4.10) to V4.0-BUG-FIX (4.0.3) RECORD ONLY (not needed in 4.0.x)
      37363: ALF-13452, ALF-13933 Alfresco needs to be able to support LibreOffice for transformations
         - Build test failure x2 (reference to jodconverter*jar not needed in 4.0.x)
   37370: Merged V3.4-BUG-FIX:
      ALF-11714: Updated WCMQS to ensure all FreeMarker variables output to HTML are protected with ?html to prevent XSS
   37382: Translation (DE, IT, JA, NL) updates from Gloria, based on EN rev37081
   37384: Fix for ALF-14219 SolrQueryHTTPClient unable to handle long queries (4096 bytes)
   37386: Merged V4.0 to V4.0-BUG-FIX
      37385: ALF-14238: Fix by Dmitry to correct iteration in ImapUnsubscribedAspectPatch


git-svn-id: https://svn.alfresco.com/repos/alfresco-enterprise/alfresco/HEAD/root@37387 c4b6b30b-aa2e-2d43-bbcb-ca4b014f7261
Author: Dave Ward
Date:   2012-06-02 07:56:08 +00:00
Parent: 8e1e570c3c
Commit: 507c4d8bf8
24 changed files with 1038 additions and 172 deletions

View File

@@ -8,13 +8,17 @@
--
-- Migrate data
--ASSIGN:min_tx_ms=min_tx_ms
SELECT min(commit_time_ms) as min_tx_ms from alf_transaction;
--FOREACH alf_acl_change_set.id system.upgrade.alf_acl_change_set.batchsize
UPDATE alf_acl_change_set
SET
commit_time_ms = ((select min(t.commit_time_ms) from alf_transaction t) + id)
commit_time_ms = ${min_tx_ms} + id
WHERE
id >= ${LOWERBOUND} AND id <= ${UPPERBOUND}
AND commit_time_ms < (select min(t.commit_time_ms) from alf_transaction t)
AND commit_time_ms < ${min_tx_ms}
;

View File

@@ -0,0 +1,192 @@
<?xml version="1.0" encoding="UTF-8"?>
<hazelcast xsi:schemaLocation="http://www.hazelcast.com/schema/config hazelcast-basic.xsd"
xmlns="http://www.hazelcast.com/schema/config"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<group>
<name>dev</name>
<password>dev-pass</password>
</group>
<network>
<port auto-increment="true">5701</port>
<join>
<multicast enabled="true">
<multicast-group>224.2.2.3</multicast-group>
<multicast-port>54327</multicast-port>
</multicast>
<tcp-ip enabled="false">
<interface>127.0.0.1</interface>
</tcp-ip>
</join>
<interfaces enabled="true">
<interface>192.168.56.1</interface>
</interfaces>
<symmetric-encryption enabled="false">
<!--
encryption algorithm such as
DES/ECB/PKCS5Padding,
PBEWithMD5AndDES,
AES/CBC/PKCS5Padding,
Blowfish,
DESede
-->
<algorithm>PBEWithMD5AndDES</algorithm>
<!-- salt value to use when generating the secret key -->
<salt>thesalt</salt>
<!-- pass phrase to use when generating the secret key -->
<password>thepass</password>
<!-- iteration count to use when generating the secret key -->
<iteration-count>19</iteration-count>
</symmetric-encryption>
<asymmetric-encryption enabled="false">
<!-- encryption algorithm -->
<algorithm>RSA/NONE/PKCS1PADDING</algorithm>
<!-- private key password -->
<keyPassword>thekeypass</keyPassword>
<!-- private key alias -->
<keyAlias>local</keyAlias>
<!-- key store type -->
<storeType>JKS</storeType>
<!-- key store password -->
<storePassword>thestorepass</storePassword>
<!-- path to the key store -->
<storePath>keystore</storePath>
</asymmetric-encryption>
</network>
<executor-service>
<core-pool-size>16</core-pool-size>
<max-pool-size>64</max-pool-size>
<keep-alive-seconds>60</keep-alive-seconds>
</executor-service>
<queue name="default">
<!--
Maximum size of the queue. When a JVM's local queue size reaches the maximum,
all put/offer operations will get blocked until the queue size
of the JVM goes down below the maximum.
Any integer between 0 and Integer.MAX_VALUE. 0 means
Integer.MAX_VALUE. Default is 0.
-->
<max-size-per-jvm>0</max-size-per-jvm>
<!--
Maximum number of seconds for each item to stay in the queue. Items that are
not consumed in <time-to-live-seconds> will automatically
get evicted from the queue.
Any integer between 0 and Integer.MAX_VALUE. 0 means
infinite. Default is 0.
-->
<time-to-live-seconds>0</time-to-live-seconds>
</queue>
<map name="default">
<!--
Number of backups. If 1 is set as the backup-count for example,
then all entries of the map will be copied to another JVM for
fail-safety. Valid numbers are 0 (no backup), 1, 2, 3.
-->
<backup-count>1</backup-count>
<!--
Valid values are:
NONE (no eviction),
LRU (Least Recently Used),
LFU (Least Frequently Used).
NONE is the default.
-->
<eviction-policy>NONE</eviction-policy>
<!--
Maximum size of the map. When max size is reached,
map is evicted based on the policy defined.
Any integer between 0 and Integer.MAX_VALUE. 0 means
Integer.MAX_VALUE. Default is 0.
-->
<max-size>0</max-size>
<!--
When max. size is reached, specified percentage of
the map will be evicted. Any integer between 0 and 100.
If 25 is set for example, 25% of the entries will
get evicted.
-->
<eviction-percentage>25</eviction-percentage>
<!--
While recovering from split-brain (network partitioning),
map entries in the small cluster will merge into the bigger cluster
based on the policy set here. When an entry merges into the
cluster, there might already be an existing entry with the same key.
Values of these entries might be different for that same key.
Which value should be set for the key? Conflict is resolved by
the policy set here. Default policy is hz.ADD_NEW_ENTRY
There are built-in merge policies such as
hz.NO_MERGE ; no entry will merge.
hz.ADD_NEW_ENTRY ; entry will be added if the merging entry's key
doesn't exist in the cluster.
hz.HIGHER_HITS ; entry with the higher hits wins.
hz.LATEST_UPDATE ; entry with the latest update wins.
-->
<merge-policy>hz.ADD_NEW_ENTRY</merge-policy>
</map>
<!-- Add your own map merge policy implementations here:
<merge-policies>
<map-merge-policy name="MY_MERGE_POLICY">
<class-name>com.acme.MyOwnMergePolicy</class-name>
</map-merge-policy>
</merge-policies>
-->
<map name="AlfrescoFilesysCache">
<!--
Number of backups. If 1 is set as the backup-count for example,
then all entries of the map will be copied to another JVM for
fail-safety. Valid numbers are 0 (no backup), 1, 2, 3.
-->
<backup-count>1</backup-count>
<!--
Valid values are:
NONE (no eviction),
LRU (Least Recently Used),
LFU (Least Frequently Used).
NONE is the default.
-->
<eviction-policy>NONE</eviction-policy>
<!--
Maximum size of the map. When max size is reached,
map is evicted based on the policy defined.
Any integer between 0 and Integer.MAX_VALUE. 0 means
Integer.MAX_VALUE. Default is 0.
-->
<max-size>0</max-size>
<!--
When max. size is reached, specified percentage of
the map will be evicted. Any integer between 0 and 100.
If 25 is set for example, 25% of the entries will
get evicted.
-->
<eviction-percentage>25</eviction-percentage>
<!--
While recovering from split-brain (network partitioning),
map entries in the small cluster will merge into the bigger cluster
based on the policy set here. When an entry merges into the
cluster, there might already be an existing entry with the same key.
Values of these entries might be different for that same key.
Which value should be set for the key? Conflict is resolved by
the policy set here. Default policy is hz.ADD_NEW_ENTRY
There are built-in merge policies such as
hz.NO_MERGE ; no entry will merge.
hz.ADD_NEW_ENTRY ; entry will be added if the merging entry's key
doesn't exist in the cluster.
hz.HIGHER_HITS ; entry with the higher hits wins.
hz.LATEST_UPDATE ; entry with the latest update wins.
-->
<merge-policy>hz.ADD_NEW_ENTRY</merge-policy>
<!--
<near-cache>
<time-to-live-seconds>5</time-to-live-seconds>
<max-idle-seconds>60</max-idle-seconds>
<eviction-policy>LRU</eviction-policy>
<max-size>1000</max-size>
<invalidate-on-change>true</invalidate-on-change>
</near-cache>
-->
</map>
</hazelcast>
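
For reference, a cluster member can load a configuration file like the sample above from the classpath. This is only an illustrative sketch using Hazelcast's public API (ClasspathXmlConfig, Hazelcast.newHazelcastInstance); it is not part of this change, and the resource name hazelcastConfig.xml on the classpath is an assumption.

import com.hazelcast.config.ClasspathXmlConfig;
import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

import java.util.Map;

public class HazelcastConfigSmokeTest
{
    public static void main(String[] args)
    {
        // Load the XML configuration (e.g. a renamed copy of hazelcastConfig.xml.sample) from the classpath
        Config config = new ClasspathXmlConfig("hazelcastConfig.xml");

        // Start a member using that configuration
        HazelcastInstance instance = Hazelcast.newHazelcastInstance(config);

        // The "AlfrescoFilesysCache" map picks up the backup/eviction/merge settings defined above
        Map<String, String> cache = instance.getMap("AlfrescoFilesysCache");
        cache.put("probe", "ok");

        Hazelcast.shutdownAll();
    }
}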

View File

@@ -161,6 +161,8 @@ patch.wcmFolders.webprojects.result.created=The Web Projects folder was successf
patch.wcmFolders.webforms.result.exists=The Web Forms folder already exists: {0}
patch.wcmFolders.webforms.result.created=The Web Forms folder was successfully created: {0}
patch.wcmDeployed.description=Adds the 'WCM Deployed' space to the company home folder.
patch.linkNodeExtension.description=Fixes link node file extensions to have a .url extension.
patch.linkNodeExtension.result=Fixed {0} link node file extensions. See file {1} for details.
patch.linkNodeExtension.err.unable_to_fix=Auto-fixing of link node file extensions failed. See file {0} for details.
@@ -470,3 +472,8 @@ patch.migrateTenantsFromAttrsToTable.result=Processed {0} tenants
patch.remoteCredentialsContainer.description=Patch to add the root folder for Shared Remote Credentials
patch.syncSetDefinitionsContainer.description=Patch to add the root folder for SyncSet Definitions
patch.swsdpPatch.description=Patch to fix up the Sample: Web Site Design Project.
patch.swsdpPatch.success=Successfully patched the Sample: Web Site Design Project.
patch.swsdpPatch.skipped=Skipped, not required.
patch.swsdpPatch.missingSurfConfig=surf-config folder is not present in Sample: Web Site Design Project.

View File

@@ -161,6 +161,8 @@ patch.wcmFolders.webprojects.result.created=The Web Projects folder was successf
patch.wcmFolders.webforms.result.exists=The Web Forms folder already exists: {0}
patch.wcmFolders.webforms.result.created=The Web Forms folder was successfully created: {0}
patch.wcmDeployed.description=Adds the 'WCM Deployed' space to the company home folder.
patch.linkNodeExtension.description=Fixes link node file extensions to have a .url extension.
patch.linkNodeExtension.result=Fixed {0} link node file extensions. See file {1} for details.
patch.linkNodeExtension.err.unable_to_fix=Auto-fixing of link node file extensions failed. See file {0} for details.
@@ -470,3 +472,8 @@ patch.migrateTenantsFromAttrsToTable.result=Processed {0} tenants
patch.remoteCredentialsContainer.description=Patch to add the root folder for Shared Remote Credentials
patch.syncSetDefinitionsContainer.description=Patch to add the root folder for SyncSet Definitions
patch.swsdpPatch.description=Patch to fix up the Sample: Web Site Design Project.
patch.swsdpPatch.success=Successfully patched the Sample: Web Site Design Project.
patch.swsdpPatch.skipped=Skipped, not required.
patch.swsdpPatch.missingSurfConfig=surf-config folder is not present in Sample: Web Site Design Project.

View File

@@ -161,6 +161,8 @@ patch.wcmFolders.webprojects.result.created=The Web Projects folder was successf
patch.wcmFolders.webforms.result.exists=The Web Forms folder already exists: {0}
patch.wcmFolders.webforms.result.created=The Web Forms folder was successfully created: {0}
patch.wcmDeployed.description=Adds the 'WCM Deployed' space to the company home folder.
patch.linkNodeExtension.description=Fixes link node file extensions to have a .url extension.
patch.linkNodeExtension.result=Fixed {0} link node file extensions. See file {1} for details.
patch.linkNodeExtension.err.unable_to_fix=Auto-fixing of link node file extensions failed. See file {0} for details.
@@ -470,3 +472,8 @@ patch.migrateTenantsFromAttrsToTable.result=Processed {0} tenants
patch.remoteCredentialsContainer.description=Patch to add the root folder for Shared Remote Credentials
patch.syncSetDefinitionsContainer.description=Patch to add the root folder for SyncSet Definitions
patch.swsdpPatch.description=Patch to fix up the Sample: Web Site Design Project.
patch.swsdpPatch.success=Successfully patched the Sample: Web Site Design Project.
patch.swsdpPatch.skipped=Skipped, not required.
patch.swsdpPatch.missingSurfConfig=surf-config folder is not present in Sample: Web Site Design Project.

View File

@@ -161,6 +161,8 @@ patch.wcmFolders.webprojects.result.created=The Web Projects folder was successf
patch.wcmFolders.webforms.result.exists=The Web Forms folder already exists: {0}
patch.wcmFolders.webforms.result.created=The Web Forms folder was successfully created: {0}
patch.wcmDeployed.description=Adds the 'WCM Deployed' space to the company home folder.
patch.linkNodeExtension.description=Fixes link node file extensions to have a .url extension.
patch.linkNodeExtension.result=Fixed {0} link node file extensions. See file {1} for details.
patch.linkNodeExtension.err.unable_to_fix=Auto-fixing of link node file extensions failed. See file {0} for details.
@@ -470,3 +472,8 @@ patch.migrateTenantsFromAttrsToTable.result=Processed {0} tenants
patch.remoteCredentialsContainer.description=Patch to add the root folder for Shared Remote Credentials
patch.syncSetDefinitionsContainer.description=Patch to add the root folder for SyncSet Definitions
patch.swsdpPatch.description=Patch to fix up the Sample: Web Site Design Project.
patch.swsdpPatch.success=Successfully patched the Sample: Web Site Design Project.
patch.swsdpPatch.skipped=Skipped, not required.
patch.swsdpPatch.missingSurfConfig=surf-config folder is not present in Sample: Web Site Design Project.

View File

@@ -3066,6 +3066,9 @@
<property name="nodeDAO">
<ref bean="nodeDAO"/>
</property>
<property name="patchDAO">
<ref bean="patchDAO"/>
</property>
<property name="personService">
<ref bean="personService" />
</property>

View File

@@ -36,6 +36,13 @@
<property name="errorCodes">
<value>2</value>
</property>
<property name="processProperties">
<bean class="org.alfresco.util.OpenOfficeCommandEnv">
<constructor-arg>
<value>${ooo.exe}</value>
</constructor-arg>
</bean>
</property>
</bean>
<bean id="openOfficeConnection" class="net.sf.jooreports.openoffice.connection.SocketOpenOfficeConnection">

View File

@@ -21,9 +21,7 @@ package org.alfresco.repo.admin.patch.impl;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.alfresco.model.ContentModel;
import org.alfresco.model.ImapModel;
@@ -33,6 +31,7 @@ import org.alfresco.repo.batch.BatchProcessor;
import org.alfresco.repo.batch.BatchProcessor.BatchProcessWorker;
import org.alfresco.repo.domain.node.NodeDAO;
import org.alfresco.repo.domain.node.NodeDAO.NodeRefQueryCallback;
import org.alfresco.repo.domain.patch.PatchDAO;
import org.alfresco.service.cmr.repository.ChildAssociationRef;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.security.PersonService;
@@ -45,27 +44,21 @@ public class ImapUnsubscribedAspectPatch extends AbstractPatch
{
private static final String MSG_NONSUBSCRIBED_ASPECT_REMOVED = "patch.imapUnsubscribedAspect.result.removed";
private static final QName ASPECT_NON_SUBSCRIBED = QName.createQName("{http://www.alfresco.org/model/imap/1.0}nonSubscribed");
private static final String PROP_MIN_ID = "minNodeId";
private NodeDAO nodeDAO;
private PatchDAO patchDAO;
private PersonService personService;
private final Map<String, Long> properties = new HashMap<String, Long>();
private final int batchThreads = 3;
private final int batchSize = 40;
private final long count = batchThreads * batchSize;
private long minSearchNodeId = 1;
private int batchThreads = 3;
private int batchSize = 40;
private long count = batchThreads * batchSize;
@Override
public void init()
{
super.init();
properties.put(PROP_MIN_ID, 1L);
}
@Override
protected String applyInternal() throws Exception
{
final List<ChildAssociationRef> users = nodeService.getChildAssocs(personService.getPeopleContainer(), ContentModel.ASSOC_CHILDREN, RegexQNamePattern.MATCH_ALL);
final long maxNodeId = patchDAO.getMaxAdmNodeID();
BatchProcessWorkProvider<NodeRef> workProvider = new BatchProcessWorkProvider<NodeRef>()
{
@@ -79,17 +72,21 @@ public class ImapUnsubscribedAspectPatch extends AbstractPatch
public Collection<NodeRef> getNextWork()
{
result.clear();
nodeDAO.getNodesWithAspects(Collections.singleton(ASPECT_NON_SUBSCRIBED), properties.get(PROP_MIN_ID), count, new NodeRefQueryCallback()
while (result.isEmpty() && minSearchNodeId < maxNodeId)
{
nodeDAO.getNodesWithAspects(Collections.singleton(ASPECT_NON_SUBSCRIBED), minSearchNodeId,
minSearchNodeId + count, new NodeRefQueryCallback()
{
public boolean handle(Pair<Long, NodeRef> nodePair)
{
properties.put(PROP_MIN_ID, nodePair.getFirst());
result.add(nodePair.getSecond());
return true;
}
public boolean handle(Pair<Long, NodeRef> nodePair)
{
result.add(nodePair.getSecond());
return true;
}
});
});
minSearchNodeId = minSearchNodeId + count + 1;
}
return result;
}
@@ -137,6 +134,11 @@ public class ImapUnsubscribedAspectPatch extends AbstractPatch
this.nodeDAO = nodeDAO;
}
public void setPatchDAO(PatchDAO patchDAO)
{
this.patchDAO = patchDAO;
}
public void setPersonService(PersonService personService)
{
this.personService = personService;
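
The ALF-14238 fix above changes the work provider to walk fixed-size node id windows up to patchDAO.getMaxAdmNodeID(), skipping empty windows, instead of keying the next query off the id of the last result. A simplified, self-contained sketch of that iteration pattern follows; IdRangeSource is a stand-in for NodeDAO.getNodesWithAspects and is not an Alfresco interface.

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

public class WindowedWorkProviderSketch
{
    /** Stand-in for NodeDAO.getNodesWithAspects(aspects, minId, maxId, callback). */
    interface IdRangeSource
    {
        List<Long> getIdsInRange(long minIdInclusive, long maxIdInclusive);
    }

    private final IdRangeSource source;
    private final long maxNodeId;   // analogous to patchDAO.getMaxAdmNodeID()
    private final long windowSize;  // analogous to batchThreads * batchSize
    private long minSearchNodeId = 1;

    WindowedWorkProviderSketch(IdRangeSource source, long maxNodeId, long windowSize)
    {
        this.source = source;
        this.maxNodeId = maxNodeId;
        this.windowSize = windowSize;
    }

    /** Mirrors BatchProcessWorkProvider.getNextWork(): skip empty windows, stop once the id space is exhausted. */
    Collection<Long> getNextWork()
    {
        List<Long> result = new ArrayList<Long>();
        while (result.isEmpty() && minSearchNodeId < maxNodeId)
        {
            result.addAll(source.getIdsInRange(minSearchNodeId, minSearchNodeId + windowSize));
            minSearchNodeId = minSearchNodeId + windowSize + 1;
        }
        return result;
    }
}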

View File

@@ -613,7 +613,7 @@ public class TransformerDebug
}
if (frame != null)
{
sb.append(spaces(9-sb.length()+lengthOfFirstId)); // Try to pad to level 5
sb.append(spaces(11-sb.length()+lengthOfFirstId)); // Try to pad to level 7
}
return sb.toString();
}

View File

@@ -940,6 +940,20 @@ public abstract class AbstractNodeDAOImpl implements NodeDAO, BatchingDAO
return nodePair.getSecond().getNodeStatus();
}
}
public Status getNodeIdStatus(Long nodeId)
{
Pair<Long, Node> nodePair = nodesCache.getByKey(nodeId);
// The nodesCache gets both live and deleted nodes.
if (nodePair == null)
{
return null;
}
else
{
return nodePair.getSecond().getNodeStatus();
}
}
public Pair<Long, NodeRef> getNodePair(NodeRef nodeRef)
{

View File

@@ -146,6 +146,16 @@ public interface NodeDAO extends NodeBulkLoader
*/
public NodeRef.Status getNodeRefStatus(NodeRef nodeRef);
/**
* Get the current status of the node, including deleted nodes.
*
* @param nodeId the node id
* @return Returns the current status of the reference.
* This will only be <tt>null</tt> if the node never existed or has been
* purged following deletion.
*/
public NodeRef.Status getNodeIdStatus(Long nodeId);
public Pair<Long, NodeRef> getNodePair(NodeRef nodeRef);
public Pair<Long, NodeRef> getNodePair(Long nodeId);

View File

@@ -188,11 +188,10 @@ public class SolrQueryHTTPClient implements BeanFactoryAware
}
url.append("/").append(languageUrlFragment);
// duplicate the query in the URL
url.append("?q=");
url.append(encoder.encode(searchParameters.getQuery(), "UTF-8"));
url.append("&wt=").append(encoder.encode("json", "UTF-8"));
// Send the query in JSON only
// url.append("?q=");
// url.append(encoder.encode(searchParameters.getQuery(), "UTF-8"));
url.append("?wt=").append(encoder.encode("json", "UTF-8"));
url.append("&fl=").append(encoder.encode("DBID,score", "UTF-8"));
if (searchParameters.getMaxItems() >= 0)
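
The ALF-14219 change above keeps the request URL short regardless of query length: only the small wt and fl parameters remain on the URL, while the query itself is sent in the request body. A rough, self-contained sketch of that shape follows; the JSON field name "query" and the /solr/alfresco path are assumptions for illustration, not the actual SolrQueryHTTPClient code.

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class SolrRequestShapeSketch
{
    public static void main(String[] args) throws UnsupportedEncodingException
    {
        String languageUrlFragment = "afts";
        String query = "TEXT:..."; // may exceed 4096 bytes without growing the URL

        // Only small, fixed parameters go on the URL ...
        StringBuilder url = new StringBuilder("/solr/alfresco/").append(languageUrlFragment);
        url.append("?wt=").append(URLEncoder.encode("json", "UTF-8"));
        url.append("&fl=").append(URLEncoder.encode("DBID,score", "UTF-8"));

        // ... while the query travels in the POST body as JSON
        String body = "{\"query\":\"" + query.replace("\\", "\\\\").replace("\"", "\\\"") + "\"}";

        System.out.println("POST " + url);
        System.out.println(body);
    }
}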

View File

@@ -760,6 +760,17 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
// Create a prefixed zone ID for use with the authority service
final String zoneId = AuthorityService.ZONE_AUTH_EXT_PREFIX + zone;
// Ensure that the zoneId exists before multiple threads start using it
this.transactionService.getRetryingTransactionHelper().doInTransaction(new RetryingTransactionCallback<Void>()
{
@Override
public Void execute() throws Throwable
{
authorityService.getOrCreateZone(zoneId);
return null;
}
}, false, splitTxns);
// The set of zones we associate with new objects (default plus registry specific)
final Set<String> zoneSet = getZones(zoneId);
@@ -1856,7 +1867,7 @@ public class ChainingUserRegistrySynchronizer extends AbstractLifecycleBean impl
* the zone id
* @return the zone set
*/
private Set<String> getZones(String zoneId)
private Set<String> getZones(final String zoneId)
{
Set<String> zones = new HashSet<String>(5);
zones.add(AuthorityService.ZONE_APP_DEFAULT);
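
The ALF-13247 change above creates the authority zone once, in its own transaction, before the parallel batch workers start, because a concurrent get-or-create of a shared parent can let two threads both create it and produce two nodes with the same primary path. A small generic sketch of the pattern (plain JDK, not Alfresco API): initialise the shared resource up front, then let the workers only read it.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PreCreateSharedParentSketch
{
    public static void main(String[] args) throws Exception
    {
        // Stand-in for the zone container: created exactly once, before any worker runs
        final Map<String, String> zones = new ConcurrentHashMap<String, String>();
        final String zoneId = "AUTH.EXT.ldap1"; // hypothetical zone id
        zones.putIfAbsent(zoneId, "zone-node"); // analogous to authorityService.getOrCreateZone(zoneId)

        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 4; i++)
        {
            pool.submit(new Runnable()
            {
                public void run()
                {
                    // Workers only read the pre-created zone; none of them races to create it
                    String zoneNode = zones.get(zoneId);
                    // ... associate newly synchronised users/groups with zoneNode ...
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }
}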

View File

@@ -922,7 +922,8 @@ public class SiteServiceImpl extends AbstractLifecycleBean implements SiteServic
// Only search for "st:site" nodes.
final Set<QName> searchTypeQNames = new HashSet<QName>(1);
searchTypeQNames.add(SiteModel.TYPE_SITE);
// searchTypeQNames.addAll(dictionaryService.getSubTypes(SiteModel.TYPE_SITE, true));
// ... and all subtypes of st:site
searchTypeQNames.addAll(dictionaryService.getSubTypes(SiteModel.TYPE_SITE, true));
// get canned query
final String cQBeanName = "siteGetChildrenCannedQueryFactory";

View File

@@ -0,0 +1,121 @@
/*
* Copyright (C) 2005-2012 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.repo.site;
import static org.junit.Assert.assertNotNull;
import java.util.HashMap;
import java.util.Map;
import org.alfresco.query.PagingRequest;
import org.alfresco.query.PagingResults;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.alfresco.repo.transaction.RetryingTransactionHelper;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.service.cmr.site.SiteInfo;
import org.alfresco.service.cmr.site.SiteService;
import org.alfresco.service.cmr.site.SiteVisibility;
import org.alfresco.service.namespace.NamespaceService;
import org.alfresco.service.namespace.QName;
import org.alfresco.util.test.junitrules.ApplicationContextInit;
import org.alfresco.util.test.junitrules.RunAsFullyAuthenticatedRule;
import org.alfresco.util.test.junitrules.TemporarySites;
import org.alfresco.util.test.junitrules.TemporarySitesTest;
import org.junit.BeforeClass;
import org.junit.ClassRule;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.RuleChain;
import org.springframework.extensions.webscripts.GUID;
/**
* This class contains some tests for the {@link SiteServiceImpl} - in addition to those already
* included in {@link SiteServiceImplTest}. This uses JUnit 4 annotations and JUnit Rules.
*
* TODO Refactor the two classes together into one common approach.
*
* @author Neil Mc Erlean
* @since 4.0.3
*/
public class SiteServiceImplMoreTest
{
// Rule to initialise the default Alfresco spring configuration
public static ApplicationContextInit APP_CONTEXT_INIT = ApplicationContextInit.createStandardContextWithOverrides("classpath:sites/test-"
+ TemporarySitesTest.class.getSimpleName() + "-context.xml");
// A rule to manage test nodes reused across all the test methods
public static TemporarySites STATIC_TEST_SITES = new TemporarySites(APP_CONTEXT_INIT);
// Tie them together in a static Rule Chain
@ClassRule public static RuleChain ruleChain = RuleChain.outerRule(APP_CONTEXT_INIT)
.around(STATIC_TEST_SITES);
@Rule public RunAsFullyAuthenticatedRule runAllTestsAsAdmin = new RunAsFullyAuthenticatedRule(AuthenticationUtil.getAdminUserName());
// Various services
private static NamespaceService NAMESPACE_SERVICE;
private static SiteService SITE_SERVICE;
private static RetryingTransactionHelper TRANSACTION_HELPER;
private static String TEST_SITE_NAME, TEST_SUB_SITE_NAME;
@BeforeClass public static void initStaticData() throws Exception
{
NAMESPACE_SERVICE = APP_CONTEXT_INIT.getApplicationContext().getBean("namespaceService", NamespaceService.class);
SITE_SERVICE = APP_CONTEXT_INIT.getApplicationContext().getBean("siteService", SiteService.class);
TRANSACTION_HELPER = APP_CONTEXT_INIT.getApplicationContext().getBean("retryingTransactionHelper", RetryingTransactionHelper.class);
// We'll create this test content as admin.
final String admin = AuthenticationUtil.getAdminUserName();
TEST_SITE_NAME = GUID.generate();
TEST_SUB_SITE_NAME = GUID.generate();
final QName subSiteType = QName.createQName("testsite", "testSubsite", NAMESPACE_SERVICE);
STATIC_TEST_SITES.createSite("sitePreset", TEST_SITE_NAME, "siteTitle", "siteDescription", SiteVisibility.PUBLIC, admin);
STATIC_TEST_SITES.createSite("sitePreset", TEST_SUB_SITE_NAME, "siteTitle", "siteDescription", SiteVisibility.PUBLIC, subSiteType, admin);
}
/**
* This method ensures that {@link SiteService#listSites(String)} includes content subtypes of {@link SiteModel#TYPE_SITE st:site}.
*/
@Test public void listSitesIncludingSubTypesOfSite() throws Exception
{
TRANSACTION_HELPER.doInTransaction(new RetryingTransactionCallback<Void>()
{
public Void execute() throws Throwable
{
PagingResults<SiteInfo> sites = SITE_SERVICE.listSites(null, null, new PagingRequest(0, 1024));
Map<String, SiteInfo> sitesByName = new HashMap<String, SiteInfo>();
for (SiteInfo site : sites.getPage())
{
sitesByName.put(site.getShortName(), site);
}
assertNotNull("st:site missing.", sitesByName.get(TEST_SITE_NAME));
assertNotNull("subtype of st:site missing.", sitesByName.get(TEST_SUB_SITE_NAME));
return null;
}
});
}
}

View File

@@ -52,6 +52,7 @@ import org.alfresco.service.cmr.dictionary.TypeDefinition;
import org.alfresco.service.cmr.repository.ChildAssociationRef;
import org.alfresco.service.cmr.repository.InvalidNodeRefException;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.NodeRef.Status;
import org.alfresco.service.cmr.repository.Path;
import org.alfresco.service.cmr.repository.datatype.DefaultTypeConverter;
import org.alfresco.service.cmr.security.OwnableService;
@@ -381,9 +382,40 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
return false;
}
private Collection<Pair<Path, QName>> getCategoryPaths(NodeRef nodeRef, Set<QName> aspects, Map<QName, Serializable> properties)
static class CategoryPaths
{
Collection<Pair<Path, QName>> paths;
List<ChildAssociationRef> categoryParents;
CategoryPaths( Collection<Pair<Path, QName>> paths, List<ChildAssociationRef> categoryParents)
{
this.paths = paths;
this.categoryParents = categoryParents;
}
/**
* @return the paths
*/
public Collection<Pair<Path, QName>> getPaths()
{
return paths;
}
/**
* @return the categoryParents
*/
public List<ChildAssociationRef> getCategoryParents()
{
return categoryParents;
}
}
private CategoryPaths getCategoryPaths(NodeRef nodeRef, Set<QName> aspects, Map<QName, Serializable> properties)
{
ArrayList<Pair<Path, QName>> categoryPaths = new ArrayList<Pair<Path, QName>>();
ArrayList<ChildAssociationRef> categoryParents = new ArrayList<ChildAssociationRef>();
nodeDAO.setCheckNodeConsistency();
for (QName classRef : aspects)
@@ -439,13 +471,16 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
{
Path.ChildAssocElement cae = (Path.ChildAssocElement) pair.getFirst().last();
ChildAssociationRef assocRef = cae.getRef();
pair.getFirst().append(new Path.ChildAssocElement(new ChildAssociationRef(assocRef.getTypeQName(), assocRef.getChildRef(), QName.createQName("member"), nodeRef)));
ChildAssociationRef categoryParentRef = new ChildAssociationRef(assocRef.getTypeQName(), assocRef.getChildRef(), QName.createQName("member"), nodeRef);
pair.getFirst().append(new Path.ChildAssocElement(categoryParentRef));
categoryParents.add(categoryParentRef);
}
}
return categoryPaths;
return new CategoryPaths(categoryPaths, categoryParents);
}
private List<Long> preCacheNodes(NodeMetaDataParameters nodeMetaDataParameters)
{
nodeDAO.setCheckNodeConsistency();
@@ -536,26 +571,33 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
for(Long nodeId : nodeIds)
{
Map<QName, Serializable> props = null;
Set<QName> aspects = null;
if (!nodeDAO.exists(nodeId))
{
// Deleted nodes have no metadata
continue;
}
Status status = nodeDAO.getNodeIdStatus(nodeId);
NodeRef nodeRef = status.getNodeRef();
NodeMetaData nodeMetaData = new NodeMetaData();
nodeMetaData.setNodeId(nodeId);
Pair<Long, NodeRef> pair = nodeDAO.getNodePair(nodeId);
nodeMetaData.setAclId(nodeDAO.getNodeAclId(nodeId));
if(includeNodeRef)
{
nodeMetaData.setNodeRef(tenantService.getBaseName(nodeRef, true));
}
if(includeTxnId)
{
nodeMetaData.setTxnId(nodeDAO.getNodeRefStatus(pair.getSecond()).getDbTxnId());
nodeMetaData.setTxnId(status.getDbTxnId());
}
if(status.isDeleted())
{
rowHandler.processResult(nodeMetaData);
continue;
}
Map<QName, Serializable> props = null;
Set<QName> aspects = null;
nodeMetaData.setAclId(nodeDAO.getNodeAclId(nodeId));
if(includeType)
{
QName nodeType = nodeDAO.getNodeType(nodeId);
@@ -572,7 +614,10 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
if(includeProperties)
{
props = getProperties(nodeId);
if(props == null)
{
props = getProperties(nodeId);
}
nodeMetaData.setProperties(props);
}
else
@@ -580,7 +625,7 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
nodeMetaData.setProperties(Collections.<QName, Serializable>emptyMap());
}
if(includeAspects)
if(includeAspects || includePaths || includeParentAssociations)
{
aspects = new HashSet<QName>();
Set<QName> sourceAspects = nodeDAO.getNodeAspects(nodeId);
@@ -595,35 +640,38 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
}
nodeMetaData.setAspects(aspects);
CategoryPaths categoryPaths = new CategoryPaths(new ArrayList<Pair<Path, QName>>(), new ArrayList<ChildAssociationRef>());
if(includePaths || includeParentAssociations)
{
if(props == null)
{
props = getProperties(nodeId);
}
categoryPaths = getCategoryPaths(status.getNodeRef(), aspects, props);
}
if(includePaths)
{
if(props == null)
{
props = getProperties(nodeId);
}
Collection<Pair<Path, QName>> categoryPaths = getCategoryPaths(pair.getSecond(), aspects, props);
List<Path> directPaths = nodeDAO.getPaths(pair, false);
Collection<Pair<Path, QName>> paths = new ArrayList<Pair<Path, QName>>(directPaths.size() + categoryPaths.size());
List<Path> directPaths = nodeDAO.getPaths(new Pair<Long, NodeRef>(nodeId, status.getNodeRef()), false);
Collection<Pair<Path, QName>> paths = new ArrayList<Pair<Path, QName>>(directPaths.size() + categoryPaths.getPaths().size());
for (Path path : directPaths)
{
paths.add(new Pair<Path, QName>(path.getBaseNamePath(tenantService), null));
}
for(Pair<Path, QName> catPair : categoryPaths)
for(Pair<Path, QName> catPair : categoryPaths.getPaths())
{
paths.add(new Pair<Path, QName>(catPair.getFirst().getBaseNamePath(tenantService), catPair.getSecond()));
}
nodeMetaData.setPaths(paths);
}
NodeRef nodeRef = pair.getSecond();
if(includeNodeRef)
{
nodeMetaData.setNodeRef(tenantService.getBaseName(nodeRef, true));
}
nodeMetaData.setTenantDomain(tenantService.getDomain(nodeRef.getStoreRef().getIdentifier()));
if(includeChildAssociations)
@@ -722,6 +770,10 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
{
}
});
for(ChildAssociationRef ref : categoryPaths.getCategoryParents())
{
parentAssocs.add(tenantService.getBaseName(ref, true));
}
CRC32 crc = new CRC32();
for(ChildAssociationRef car : parentAssocs)
@@ -747,7 +799,7 @@ public class SOLRTrackingComponentImpl implements SOLRTrackingComponent
if(includeOwner)
{
// cached in OwnableService
nodeMetaData.setOwner(ownableService.getOwner(pair.getSecond()));
nodeMetaData.setOwner(ownableService.getOwner(status.getNodeRef()));
}
rowHandler.processResult(nodeMetaData);

View File

@@ -0,0 +1,62 @@
/* Copyright (C) 2005-2012 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.util;
import java.io.File;
import java.io.IOException;
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.util.OpenOfficeURI;
import org.alfresco.util.exec.RuntimeExec;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
/**
* A map giving the environment openoffice or libreoffice commands require to start.
*
* @author Alan Davis
*/
public class OpenOfficeCommandEnv extends AbstractMap<String, String>
{
private static final Log logger = LogFactory.getLog(OpenOfficeCommandLine.class);
private static final String DYLD_LIBRARY_PATH = "DYLD_LIBRARY_PATH";
private Map<String, String> map = new HashMap<String, String>(System.getenv());
private OpenOfficeVariant variant = new OpenOfficeVariant();
public OpenOfficeCommandEnv(String exe) throws IOException
{
if (variant.isMac())
{
map.remove(DYLD_LIBRARY_PATH);
logger.debug("Removing $DYLD_LIBRARY_PATH from the environment so that LibreOffice/OpenOffice will start on Mac.");
}
}
@Override
public Set<java.util.Map.Entry<String, String>> entrySet()
{
return map.entrySet();
}
}
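
For illustration only: an environment map like the one OpenOfficeCommandEnv builds can be handed to a spawned process. The sketch below uses the JDK ProcessBuilder rather than Alfresco's RuntimeExec (whose processProperties wiring appears in the Spring snippet earlier), and the soffice path is a placeholder for the real ${ooo.exe} value.

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class OpenOfficeEnvSketch
{
    public static void main(String[] args) throws IOException
    {
        // Start from the current environment and drop DYLD_LIBRARY_PATH, as OpenOfficeCommandEnv does on Mac
        Map<String, String> env = new HashMap<String, String>(System.getenv());
        env.remove("DYLD_LIBRARY_PATH");

        // Placeholder executable path; the real value comes from the ${ooo.exe} property
        ProcessBuilder pb = new ProcessBuilder("/Applications/LibreOffice.app/Contents/MacOS/soffice", "--headless");
        pb.environment().clear();
        pb.environment().putAll(env);

        pb.start(); // throws IOException if the placeholder path does not exist
    }
}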

View File

@@ -26,7 +26,6 @@ import java.util.List;
import java.util.Map;
import java.util.Set;
import org.alfresco.error.AlfrescoRuntimeException;
import org.alfresco.util.OpenOfficeURI;
import org.alfresco.util.exec.RuntimeExec;
import org.apache.commons.logging.Log;
@@ -40,30 +39,30 @@ import org.apache.commons.logging.LogFactory;
*/
public class OpenOfficeCommandLine extends AbstractMap<String, List<String>>
{
private static final String[] EXTENSIONS = new String[] {"", ".exe", ".com", ".bat", ".cmd"};
private static final Log logger = LogFactory.getLog(OpenOfficeCommandLine.class);
private static final String OS_NAME = System.getProperty("os.name").toLowerCase();
private Map<String, List<String>> map = new HashMap<String, List<String>>();
private boolean windows;
private OpenOfficeVariant variant = new OpenOfficeVariant();
public OpenOfficeCommandLine(String exe, String port, String user) throws IOException
{
windows = isWindows();
File executable = findExecutable(exe);
File officeHome = getOfficeHome(executable);
File executable = variant.findExecutable(exe);
File officeHome = variant.getOfficeHome(executable);
List<String> command = new ArrayList<String>();
String acceptValue = "socket,host=127.0.0.1,port="+port+";urp;StarOffice.ServiceManager";
String userInstallation = new OpenOfficeURI(user).toString();
command.add(executable == null ? exe : executable.getAbsolutePath());
if (isLibreOffice3Dot5(officeHome))
if (variant.isLibreOffice3Dot5(officeHome))
{
command.add("--accept=" + acceptValue);
command.add("-env:UserInstallation=" + userInstallation);
if (variant.isMac())
{
command.add("--env:UserInstallation=" + userInstallation);
}
else
{
command.add("-env:UserInstallation=" + userInstallation);
}
command.add("--headless");
command.add("--nocrashreport");
//command.add("--nodefault"); included by JOD
@@ -71,7 +70,7 @@ public class OpenOfficeCommandLine extends AbstractMap<String, List<String>>
//command.add("--nolockcheck"); included by JOD
command.add("--nologo");
command.add("--norestore");
logger.info("Using GNU based LibreOffice command: "+command);
logger.info("Using GNU based LibreOffice command"+(variant.isMac() ? " on Mac" : "")+": "+command);
}
else
{
@@ -89,100 +88,6 @@ public class OpenOfficeCommandLine extends AbstractMap<String, List<String>>
map.put(RuntimeExec.KEY_OS_DEFAULT, command);
}
private File getOfficeHome(File executable)
{
// Get the grandparent
File officeHome = executable;
for (int i=1; officeHome != null && i <= 2; i++)
{
officeHome = officeHome.getParentFile();
}
if (officeHome == null && executable != null)
{
throw new AlfrescoRuntimeException("Did not find OpenOffice home from executable "+executable.getAbsolutePath());
}
return officeHome;
}
private File findExecutable(String executableName)
{
File file = new File(executableName);
if (file.isAbsolute())
{
file = canExecute(file);
}
else
{
file = findExecutableOnPath(executableName);
}
return file;
}
private File findExecutableOnPath(String executableName)
{
String systemPath = System.getenv("PATH");
systemPath = systemPath == null ? System.getenv("path") : systemPath;
String[] pathDirs = systemPath.split(File.pathSeparator);
File fullyQualifiedExecutable = null;
for (String pathDir : pathDirs)
{
File file = canExecute(new File(pathDir, executableName));
if (file != null)
{
fullyQualifiedExecutable = file;
break;
}
}
return fullyQualifiedExecutable;
}
private File canExecute(File file)
{
File fullyQualifiedExecutable = null;
File dir = file.getParentFile();
String name = file.getName();
for (String ext: EXTENSIONS)
{
file = new File(dir, name+ext);
if (file.canExecute())
{
fullyQualifiedExecutable = file;
break;
}
if (!windows)
{
break;
}
}
return fullyQualifiedExecutable;
}
private boolean isLibreOffice3Dot5(File officeHome)
{
return
officeHome != null &&
!new File(officeHome, "basis-link").isFile() &&
new File(officeHome, "ure-link").isFile();
}
private static boolean isLinux()
{
return OS_NAME.startsWith("linux");
}
private static boolean isMac()
{
return OS_NAME.startsWith("mac");
}
private static boolean isWindows()
{
return OS_NAME.startsWith("windows");
}
@Override
public Set<java.util.Map.Entry<String, List<String>>> entrySet()
{

View File

@@ -0,0 +1,139 @@
/* Copyright (C) 2005-2012 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.util;
import java.io.File;
import org.alfresco.error.AlfrescoRuntimeException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
/**
* Provides OpenOffice and LibreOffice variant information.
*
* @author Alan Davis
*/
public class OpenOfficeVariant
{
private static final String[] EXTENSIONS = new String[] {"", ".exe", ".com", ".bat", ".cmd"};
private static final Log logger = LogFactory.getLog(OpenOfficeCommandLine.class);
private static final String OS_NAME = System.getProperty("os.name").toLowerCase();
private final boolean windows = isWindows();
public File getOfficeHome(File executable)
{
// Get the grandparent
File officeHome = executable;
for (int i=1; officeHome != null && i <= 2; i++)
{
officeHome = officeHome.getParentFile();
}
if (officeHome == null && executable != null)
{
throw new AlfrescoRuntimeException("Did not find OpenOffice home from executable "+executable.getAbsolutePath());
}
return officeHome;
}
public File findExecutable(String executableName)
{
File file = new File(executableName);
if (file.isAbsolute())
{
file = canExecute(file);
}
else
{
file = findExecutableOnPath(executableName);
}
return file;
}
private File findExecutableOnPath(String executableName)
{
String systemPath = System.getenv("PATH");
systemPath = systemPath == null ? System.getenv("path") : systemPath;
String[] pathDirs = systemPath.split(File.pathSeparator);
File fullyQualifiedExecutable = null;
for (String pathDir : pathDirs)
{
File file = canExecute(new File(pathDir, executableName));
if (file != null)
{
fullyQualifiedExecutable = file;
break;
}
}
return fullyQualifiedExecutable;
}
private File canExecute(File file)
{
File fullyQualifiedExecutable = null;
File dir = file.getParentFile();
String name = file.getName();
for (String ext: EXTENSIONS)
{
file = new File(dir, name+ext);
if (file.canExecute())
{
fullyQualifiedExecutable = file;
break;
}
if (!windows)
{
break;
}
}
return fullyQualifiedExecutable;
}
public boolean isLibreOffice3Dot5(File officeHome)
{
logger.debug("System.getProperty(\"os.name\")="+System.getProperty("os.name"));
logger.debug("officeHome="+(officeHome == null ? null : "'"+officeHome.getAbsolutePath()+"'"));
logger.debug("basis-link:"+new File(officeHome, "basis-link").isFile());
logger.debug(" ure-link:"+new File(officeHome, "ure-link").isFile());
logger.debug("basis-link:"+new File(officeHome, "basis-link").isDirectory());
logger.debug(" ure-link:"+new File(officeHome, "ure-link").isDirectory());
return
officeHome != null &&
!new File(officeHome, "basis-link").isFile() &&
(new File(officeHome, "ure-link").isFile() || new File(officeHome, "ure-link").isDirectory());
}
public boolean isLinux()
{
return OS_NAME.startsWith("linux");
}
public boolean isMac()
{
return OS_NAME.startsWith("mac");
}
public boolean isWindows()
{
return OS_NAME.startsWith("windows");
}
}

View File

@@ -0,0 +1,165 @@
/*
* Copyright (C) 2005-2012 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.util.test.junitrules;
import java.util.ArrayList;
import java.util.List;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.alfresco.repo.security.authentication.AuthenticationUtil.RunAsWork;
import org.alfresco.repo.site.SiteModel;
import org.alfresco.repo.transaction.RetryingTransactionHelper;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.service.cmr.site.SiteInfo;
import org.alfresco.service.cmr.site.SiteService;
import org.alfresco.service.cmr.site.SiteVisibility;
import org.alfresco.service.namespace.QName;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.junit.rules.ExternalResource;
/**
* A JUnit rule designed to help with the automatic cleanup of temporary st:site nodes.
*
* @author Neil Mc Erlean
* @since 4.0.3
*/
public class TemporarySites extends ExternalResource
{
private static final Log log = LogFactory.getLog(TemporarySites.class);
private final ApplicationContextInit appContextRule;
private List<SiteInfo> temporarySites = new ArrayList<SiteInfo>();
/**
* Constructs the rule with a reference to a {@link ApplicationContextInit rule} which can be used to retrieve the ApplicationContext.
*
* @param appContextRule a rule which can be used to retrieve the spring app context.
*/
public TemporarySites(ApplicationContextInit appContextRule)
{
this.appContextRule = appContextRule;
}
@Override protected void before() throws Throwable
{
// Intentionally empty
}
@Override protected void after()
{
final RetryingTransactionHelper transactionHelper = (RetryingTransactionHelper) appContextRule.getApplicationContext().getBean("retryingTransactionHelper");
final SiteService siteService = appContextRule.getApplicationContext().getBean("siteService", SiteService.class);
// Run as admin to ensure all sites can be deleted irrespective of which user created them.
AuthenticationUtil.runAs(new RunAsWork<Void>()
{
@Override public Void doWork() throws Exception
{
transactionHelper.doInTransaction(new RetryingTransactionCallback<Void>()
{
@Override public Void execute() throws Throwable
{
for (SiteInfo site : temporarySites)
{
final String shortName = site.getShortName();
if (siteService.getSite(shortName) != null)
{
log.debug("Deleting temporary site " + shortName);
siteService.deleteSite(shortName);
}
}
return null;
}
});
return null;
}
}, AuthenticationUtil.getAdminUserName());
}
/**
* Add a specified site to the list of SiteInfos to be deleted by this rule.
*
* @param temporarySite a SiteInfo
*/
public void addSite(SiteInfo temporarySite)
{
this.temporarySites.add(temporarySite);
}
/**
* This method creates a Share Site and adds it to the internal list of NodeRefs to be tidied up by the rule.
* This method will be run in its own transaction and will be run with the specified user as the fully authenticated user,
* thus ensuring the named user is the creator of the new site.
*
* @param sitePreset the site preset
* @param siteShortName the short name of the new site
* @param siteTitle the title of the new site
* @param siteDescription the description of the new site
* @param visibility the visibility
* @param siteCreator the username of the person who will create the site
* @return the newly created SiteInfo (will be of type st:site).
*/
public SiteInfo createSite(final String sitePreset, final String siteShortName, final String siteTitle, final String siteDescription,
final SiteVisibility visibility, final String siteCreator)
{
return this.createSite(sitePreset, siteShortName, siteTitle, siteDescription, visibility, SiteModel.TYPE_SITE, siteCreator);
}
/**
* This method creates a Share Site (<b>or subtype</b>) and adds it to the internal list of NodeRefs to be tidied up by the rule.
* This method will be run in its own transaction and will be run with the specified user as the fully authenticated user,
* thus ensuring the named user is the creator of the new site.
*
* @param sitePreset the site preset
* @param siteShortName the short name of the new site
* @param siteTitle the title of the new site
* @param siteDescription the description of the new site
* @param visibility the visibility
* @param siteType the node type of the site (must be st:site or a subtype)
* @param siteCreator the username of the person who will create the site
* @return the newly created SiteInfo.
*/
public SiteInfo createSite(final String sitePreset, final String siteShortName, final String siteTitle, final String siteDescription,
final SiteVisibility visibility, final QName siteType, final String siteCreator)
{
final RetryingTransactionHelper transactionHelper = appContextRule.getApplicationContext().getBean("retryingTransactionHelper", RetryingTransactionHelper.class);
AuthenticationUtil.pushAuthentication();
AuthenticationUtil.setFullyAuthenticatedUser(siteCreator);
SiteInfo newSite = transactionHelper.doInTransaction(new RetryingTransactionCallback<SiteInfo>()
{
public SiteInfo execute() throws Throwable
{
final SiteService siteService = appContextRule.getApplicationContext().getBean("siteService", SiteService.class);
return siteService.createSite(sitePreset, siteShortName, siteTitle, siteDescription, visibility, siteType);
}
});
AuthenticationUtil.popAuthentication();
this.temporarySites.add(newSite);
return newSite;
}
}

View File

@@ -0,0 +1,111 @@
/*
* Copyright (C) 2005-2012 Alfresco Software Limited.
*
* This file is part of Alfresco
*
* Alfresco is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Alfresco is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
*/
package org.alfresco.util.test.junitrules;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotNull;
import org.alfresco.repo.security.authentication.AuthenticationUtil;
import org.alfresco.repo.transaction.RetryingTransactionHelper;
import org.alfresco.repo.transaction.RetryingTransactionHelper.RetryingTransactionCallback;
import org.alfresco.service.cmr.site.SiteInfo;
import org.alfresco.service.cmr.site.SiteService;
import org.alfresco.service.cmr.site.SiteVisibility;
import org.alfresco.service.namespace.NamespaceService;
import org.alfresco.service.namespace.QName;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.ClassRule;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.RuleChain;
/**
* Test class for {@link TemporarySites}.
*
* @author Neil McErlean
* @since 4.0.3
*/
public class TemporarySitesTest
{
// Rule to initialise the default Alfresco spring configuration
public static ApplicationContextInit APP_CONTEXT_INIT =
ApplicationContextInit.createStandardContextWithOverrides("classpath:sites/test-"
+ TemporarySitesTest.class.getSimpleName() + "-context.xml");
// A rule to manage test sites reused across all the test methods
public static TemporaryNodes STATIC_TEST_SITES = new TemporaryNodes(APP_CONTEXT_INIT);
// Tie them together in a static Rule Chain
@ClassRule public static RuleChain ruleChain = RuleChain.outerRule(APP_CONTEXT_INIT)
.around(STATIC_TEST_SITES);
// A rule to manage test sites use in each test method
@Rule public TemporarySites testSites = new TemporarySites(APP_CONTEXT_INIT);
// A rule to allow individual test methods all to be run as "admin".
@Rule public RunAsFullyAuthenticatedRule runAsRule = new RunAsFullyAuthenticatedRule(AuthenticationUtil.getAdminUserName());
// Various services
private static NamespaceService NAMESPACE_SERVICE;
private static SiteService SITE_SERVICE;
private static RetryingTransactionHelper TRANSACTION_HELPER;
// These SiteInfos are used by the test methods.
private SiteInfo testSite1, testSite2;
@BeforeClass public static void initStaticData() throws Exception
{
NAMESPACE_SERVICE = APP_CONTEXT_INIT.getApplicationContext().getBean("namespaceService", NamespaceService.class);
SITE_SERVICE = APP_CONTEXT_INIT.getApplicationContext().getBean("siteService", SiteService.class);
TRANSACTION_HELPER = APP_CONTEXT_INIT.getApplicationContext().getBean("retryingTransactionHelper", RetryingTransactionHelper.class);
}
@Before public void createTestContent()
{
// Create some test content
testSite1 = testSites.createSite("sitePreset", "testSite1", "t", "d", SiteVisibility.PUBLIC, AuthenticationUtil.getAdminUserName());
final QName subSiteType = QName.createQName("testsite", "testSubsite", NAMESPACE_SERVICE);
testSite2 = testSites.createSite("sitePreset", "testSite2", "T", "D", SiteVisibility.PUBLIC, subSiteType, AuthenticationUtil.getAdminUserName());
}
@Test public void ensureTestSitesWereCreatedOk() throws Exception
{
TRANSACTION_HELPER.doInTransaction(new RetryingTransactionCallback<Void>()
{
public Void execute() throws Throwable
{
final SiteInfo recoveredSite1 = SITE_SERVICE.getSite(testSite1.getShortName());
final SiteInfo recoveredSite2 = SITE_SERVICE.getSite(testSite2.getShortName());
assertNotNull("Test site does not exist", recoveredSite1);
assertNotNull("Test site does not exist", recoveredSite2);
assertEquals("cm:title was wrong", "t", recoveredSite1.getTitle());
assertEquals("cm:description was wrong", "d", recoveredSite1.getDescription());
assertEquals("preset was wrong", "sitePreset", recoveredSite1.getSitePreset());
assertEquals("site visibility was wrong", SiteVisibility.PUBLIC, recoveredSite1.getVisibility());
return null;
}
});
}
}

View File

@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE beans PUBLIC '-//SPRING//DTD BEAN//EN' 'http://www.springframework.org/dtd/spring-beans.dtd'>
<beans>
<bean id="test.sites.dictionaryBootstrap" parent="dictionaryModelBootstrap" depends-on="dictionaryBootstrap">
<property name="models">
<list>
<value>sites/testSiteModel.xml</value>
</list>
</property>
</bean>
</beans>

View File

@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="UTF-8"?>
<model name="testsite:testSiteModel" xmlns="http://www.alfresco.org/model/dictionary/1.0">
<description>Test Site Model</description>
<author>Alfresco</author>
<version>1.0</version>
<imports>
<import uri="http://www.alfresco.org/model/dictionary/1.0" prefix="d"/>
<import uri="http://www.alfresco.org/model/content/1.0" prefix="cm"/>
<import uri="http://www.alfresco.org/model/system/1.0" prefix="sys" />
<import uri="http://www.alfresco.org/model/site/1.0" prefix="st" />
</imports>
<namespaces>
<namespace uri="http://www.alfresco.org/model/testsite/1.0" prefix="testsite"/>
</namespaces>
<types>
<type name="testsite:testSubsite">
<title>Test Sub-site</title>
<parent>st:site</parent>
</type>
</types>
</model>