mirror of https://github.com/Alfresco/alfresco-community-repo.git, synced 2025-08-14 17:58:59 +00:00
40605: ALF-15273: Merged PATCHES/V4.0.1 to V4.1-BUG-FIX
   40132: ALF-15376: Activiti schema updates fail when hibernate.default_schema is set with ORA-00942.
      - Corrected schema case to uppercase when database is Oracle.
   40235: ALF-15367: Reverse merged the following revisions because the fix is deemed not robust enough.
      40132: ALF-15376: Activiti schema updates fail when hibernate.default_schema is set with ORA-00942.
         - Corrected schema case to uppercase when database is Oracle.
   40041: ALF-15376: Merged V4.1-BUG-FIX to PATCHES/V4.0.1
      39969: Merged DEV/BELARUS-V4.1-BUG-FIX-2012_07_09 to V4.1-BUG-FIX:
         ALF-15273: Activiti schema updates fail when hibernate.default_schema is set with ORA-00942.
         The Activiti database is now correctly initialized with the correct "hibernate.default_schema".
   40470: ALF-15376: How to debug the creation of Activiti tables (ACT_) when upgrading to 4.X
      - Added more logging to Activiti schema creation.
   40471: ALF-15376: Activiti schema updates fail when hibernate.default_schema is set with ORA-00942
      - Ignore hibernate.default_schema and determine the default schema from the Connection DatabaseMetaData.
      - Provided the Activiti schema initializer with the default schema information.
      - Provided countAppliedPatches() with the default schema information.
   40501: ALF-15376: Improved webapp logging.properties to use a console handler so that it doesn't suppress absolutely everything and we can selectively turn on logging.
40608: Fix for ALF-4274 - JSF - Paste action does not work when browse.jsp is overridden.
40611: GERMAN: Translation updates based on EN r40604
40612: SPANISH: Translation updates based on EN r40604
40613: FRENCH: Translation updates based on EN r40604
40614: ITALIAN: Translation updates based on EN r40604
40615: JAPANESE: Translation updates based on EN r40604
40616: DUTCH: Translation updates based on EN r40604
40617: CHINESE: Translation updates based on EN r40604
40629: ALF-15321: Upgrade Activiti to fix logging.
40632: Fix for ALF-15487 - Search not working for queries containing 3-digit versions.
       Fix for ALF-15356 - SOLR doesn't support searching by cm:name of a file with underscores and dots.
40655: Fix for ALF-14752 - Collapse Links part of the WCM details page leads to an error.
40662: Eclipse classpath fixes.
40663: Merged DEV to V4.1-BUG-FIX
   40661: ALF-15318 (part 2): It's possible to log in as a disabled user (NTLM with SSO in a clustered environment).
      The onValidateFailed() methods were moved to BaseSSOAuthenticationFilter to respond with a 401 for a disabled user.
40665: ALF-15448: Merged V3.4-BUG-FIX (3.4.11) to V4.1-BUG-FIX (4.1.1)
   40664: ALF-15578 CLONE 3.4.11: LibreOffice 3.6 startup on Mac fails.
40685: Merged PATCHES/V4.0.2 to V4.1-BUG-FIX
   39274: Merged DEV to V4.0.2 (4.0.2.4)
      << Unable to merge the code as supplied because it introduced a change to a public API, which would break alfresco.log if the RM AMP was installed. See RM-452. >>
      39166: ALF-15583 / ALF-14584: autoVersionOnUpdateProps=true does not increment the version label after checkout/checkin.
         'VersionableAspectTest' has been modified in accordance with the concept that several modifications of a node in a single transaction are interpreted as a single version.
         Each operation in the test which should produce a new version has been made atomic.
      39089: ALF-15583 / ALF-14584: autoVersionOnUpdateProps=true does not increment the version label after checkout/checkin.
         The lock check has been corrected, since the 'cm:lockable' aspect doesn't indicate lock state:
         - The 'LockService' service has been extended with an 'isLocked(NodeRef)' method which returns 'true' if the document is locked and the current user is not the owner of the lock.
         - A new 'VersionableAspectTest' has been added to test the use case described in the issue and to test whether 'VersionableAspect' changes the version label of a locked document.
      39369: ALF-15583 / ALF-14584: autoVersionOnUpdateProps=true does not increment the version label after checkout/checkin.
         - Test failures: a READ_ONLY lock was being set because we are adding a versionable aspect, which resulted in an exception when attempting to update the version. Changed the isLocked method (now called isLockedOrReadOnly) to reflect that a node is locked even for the owner when the lock type is not a WRITE lock.
   39939: ALF-15584 / ALF-15001: Gracefully handle stale NodeRefs in query results in DMDiscoveryServicePort.
      - SOLR makes this more likely to occur.
   40455: ALF-15585 / ALF-15383: Long-running Feed Cleaner - Part 1: Limit problems caused by missing indexes.
      - Remove all count calls.
      - Remove logic requiring calls to SiteService to list all sites.
      - Added an ID range limit to act as a hard stop to entry growth (set to 1M).
      - TODO: use JobLockService.
   40461: ALF-15585 / ALF-15383: Long-running Feed Cleaner - Part 2: Added JobLockService usage to ensure only one instance runs at a time.
   40463: ALF-15585 / ALF-15383: Long-running Feed Cleaner - A bit more trace and debug.
   40526: ALF-15586: Fixed ALF-15540: CMIS: Synchronized block in service interceptor.
   40574: ALF-15585 / ALF-15383: Long-running Feed Cleaner - Fix MySQL variant of activities-common-SqlMap.
   40579: ALF-15585: Fix fallout from rev 40455 (ALF-15383: Long-running Feed Cleaner).
      - MySQL dialect was duplicating ALL SQL statements.
      - Split 'large' SQL selects into activities-select-SqlMap.xml, containing 7 statements that are all overridden for MySQL.
      - Fixed the split in the common file between different types of statements.
   40588: ALF-15587 / ALF-15385: Merged V3.4-BUG-FIX to PATCHES/V4.0.2 (lost revision)
      28830: ALF-7622: Refactored JScriptWorkflowTask. Now when setProperties() is called it properly updates the WorkflowTask properties via the WorkflowService.updateTask() method.
40687: Merged V3.4-BUG-FIX to V4.1-BUG-FIX
   40599: ALF-15567: Merged PATCHES/V3.4.10 to V3.4-BUG-FIX
      40511: ALF-12008: Merged DEV to PATCHES/V3.4.10
         Due to Windows Explorer's URL concatenation behaviour, we must present links as shortcuts to the real URL, rather than direct hrefs. This is at least consistent with the way the CIFS server handles links. See org.alfresco.filesys.repo.ContentDiskDriver.openFile().
      40518: ALF-12008: Fixed compilation error.

git-svn-id: https://svn.alfresco.com/repos/alfresco-enterprise/alfresco/HEAD/root@40691 c4b6b30b-aa2e-2d43-bbcb-ca4b014f7261
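For the ALF-12008 change noted above, GetMethod (below) answers a GET on a link node with a Windows [InternetShortcut] body rather than a direct href. The following is a minimal standalone sketch of that payload shape; the class, method, host and path are invented for illustration and are not part of the repository:

public class InternetShortcutExample
{
    /**
     * Builds a Windows .url shortcut body pointing at a WebDAV resource,
     * mirroring the payload assembled in GetMethod.executeImpl() for link nodes.
     */
    public static String buildShortcut(String host, int port, String encodedPath)
    {
        StringBuilder urlStr = new StringBuilder();
        urlStr.append("[InternetShortcut]\r\n");
        urlStr.append("URL=file://");
        urlStr.append(host).append(':').append(port).append(encodedPath);
        urlStr.append("\r\n");
        return urlStr.toString();
    }

    public static void main(String[] args)
    {
        // Hypothetical host and path, purely for illustration
        System.out.print(buildShortcut("localhost", 8080, "/alfresco/webdav/Sites/example/documentLibrary/report.docx"));
    }
}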
654 lines
25 KiB
Java
/*
 * Copyright (C) 2005-2010 Alfresco Software Limited.
 *
 * This file is part of Alfresco
 *
 * Alfresco is free software: you can redistribute it and/or modify
 * it under the terms of the GNU Lesser General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * Alfresco is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public License
 * along with Alfresco. If not, see <http://www.gnu.org/licenses/>.
 */
package org.alfresco.repo.webdav;

import java.io.IOException;
import java.io.Writer;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.StringTokenizer;

import javax.servlet.http.HttpServletResponse;

import org.alfresco.model.ContentModel;
import org.alfresco.repo.content.filestore.FileContentReader;
import org.alfresco.repo.web.util.HttpRangeProcessor;
import org.alfresco.service.cmr.model.FileFolderService;
import org.alfresco.service.cmr.model.FileInfo;
import org.alfresco.service.cmr.model.FileNotFoundException;
import org.alfresco.service.cmr.repository.ContentReader;
import org.alfresco.service.cmr.repository.MimetypeService;
import org.alfresco.service.cmr.repository.NodeRef;
import org.alfresco.service.cmr.repository.Path;
import org.alfresco.service.cmr.repository.datatype.DefaultTypeConverter;
import org.alfresco.service.cmr.repository.datatype.TypeConverter;
import org.springframework.extensions.surf.util.I18NUtil;

/**
 * Implements the WebDAV GET method
 *
 * @author gavinc
 */
public class GetMethod extends WebDAVMethod
{
    // Request parameters

    private static final String RANGE_HEADER_UNIT_SPECIFIER = "bytes=";

    private ArrayList<String> ifMatchTags = null;
    private ArrayList<String> ifNoneMatchTags = null;
    private Date m_ifModifiedSince = null;
    private Date m_ifUnModifiedSince = null;

    protected boolean m_returnContent = true;
    private String byteRanges;

    /**
     * Default constructor
     */
    public GetMethod()
    {
    }

    /**
     * Parse the request headers
     *
     * @exception WebDAVServerException
     */
    protected void parseRequestHeaders() throws WebDAVServerException
    {
        // Capture the Range header if present; any byte ranges are processed later in executeImpl()
        String strRange = m_request.getHeader(WebDAV.HEADER_RANGE);

        if (strRange != null && strRange.length() > 0)
        {
            byteRanges = strRange;
            if (logger.isDebugEnabled())
            {
                logger.debug("Range header supplied: " + byteRanges);
            }
        }

        // Capture all the If headers, process later

        String strIfMatch = m_request.getHeader(WebDAV.HEADER_IF_MATCH);

        if (strIfMatch != null && strIfMatch.length() > 0)
        {
            ifMatchTags = parseETags(strIfMatch);
        }

        String strIfNoneMatch = m_request.getHeader(WebDAV.HEADER_IF_NONE_MATCH);
        if (strIfNoneMatch != null && strIfNoneMatch.length() > 0)
        {
            ifNoneMatchTags = parseETags(strIfNoneMatch);
        }

        // Parse the dates

        SimpleDateFormat dateFormat = new SimpleDateFormat(WebDAV.HEADER_IF_DATE_FORMAT);
        String strIfModifiedSince = m_request.getHeader(WebDAV.HEADER_IF_MODIFIED_SINCE);

        if (strIfModifiedSince != null && strIfModifiedSince.length() > 0)
        {
            try
            {
                m_ifModifiedSince = dateFormat.parse(strIfModifiedSince);
            }
            catch (ParseException e)
            {
                logger.warn("Failed to parse If-Modified-Since date of " + strIfModifiedSince);
            }
        }

        String strIfUnModifiedSince = m_request.getHeader(WebDAV.HEADER_IF_UNMODIFIED_SINCE);
        if (strIfUnModifiedSince != null && strIfUnModifiedSince.length() > 0)
        {
            try
            {
                m_ifUnModifiedSince = dateFormat.parse(strIfUnModifiedSince);
            }
            catch (ParseException e)
            {
                logger.warn("Failed to parse If-Unmodified-Since date of " + strIfUnModifiedSince);
            }
        }
    }

    /**
     * Parse the request body
     *
     * @exception WebDAVServerException
     */
    protected void parseRequestBody() throws WebDAVServerException
    {
        // Nothing to do in this method
    }

    /**
     * @return Returns <tt>true</tt> always
     */
    @Override
    protected boolean isReadOnly()
    {
        return true;
    }

    /**
     * Execute the WebDAV request
     *
     * @exception WebDAVServerException
     */
    protected void executeImpl() throws WebDAVServerException, Exception
    {
        FileFolderService fileFolderService = getFileFolderService();
        NodeRef rootNodeRef = getRootNodeRef();
        String path = getPath();
        String servletPath = getServletPath();

        FileInfo nodeInfo = null;
        try
        {
            nodeInfo = getDAVHelper().getNodeForPath(rootNodeRef, path, servletPath);
        }
        catch (FileNotFoundException e)
        {
            throw new WebDAVServerException(HttpServletResponse.SC_NOT_FOUND);
        }

        FileInfo realNodeInfo = nodeInfo;

        // ALF-12008: Due to Windows Explorer's URL concatenation behaviour, we must present links as shortcuts to the real URL, rather than direct hrefs.
        // This is at least consistent with the way the CIFS server handles links. See org.alfresco.filesys.repo.ContentDiskDriver.openFile().
        if (realNodeInfo.isLink())
        {
            Path pathToNode = getNodeService().getPath(nodeInfo.getLinkNodeRef());
            if (pathToNode.size() > 2)
            {
                pathToNode = pathToNode.subPath(2, pathToNode.size() - 1);
            }

            String rootURL = WebDAV.getURLForPath(m_request, pathToNode.toDisplayPath(getNodeService(), getPermissionService()), true);
            if (rootURL.endsWith(WebDAVHelper.PathSeperator) == false)
            {
                rootURL = rootURL + WebDAVHelper.PathSeperator;
            }

            String fname = (String) getNodeService().getProperty(nodeInfo.getLinkNodeRef(), ContentModel.PROP_NAME);
            String webDavUrl = m_request.getServerName() + ":" + m_request.getServerPort() + rootURL + WebDAVHelper.encodeURL(fname, m_userAgent);

            StringBuilder urlStr = new StringBuilder();
            urlStr.append("[InternetShortcut]\r\n");
            urlStr.append("URL=file://");
            urlStr.append(webDavUrl);
            urlStr.append("\r\n");

            m_response.setHeader(WebDAV.HEADER_CONTENT_TYPE, "text/plain; charset=ISO-8859-1");
            m_response.setHeader(WebDAV.HEADER_CONTENT_LENGTH, String.valueOf(urlStr.length()));
            m_response.getWriter().write(urlStr.toString());
        }
        // Check if the node is a folder
        else if (realNodeInfo.isFolder())
        {
            // is content required
            if (!m_returnContent)
            {
                // ALF-7883 fix, HEAD for collection (see http://www.webdav.org/specs/rfc2518.html#rfc.section.8.4)
                return;
            }

            // Generate a folder listing
            m_response.setContentType("text/html;charset=UTF-8");
            generateDirectoryListing(nodeInfo);
        }
        else
        {
            // Return the node details, and content if requested; check that the node passes the pre-conditions
            checkPreConditions(realNodeInfo);

            // Build the response header
            m_response.setHeader(WebDAV.HEADER_ETAG, getDAVHelper().makeQuotedETag(nodeInfo));

            Date modifiedDate = realNodeInfo.getModifiedDate();
            if (modifiedDate != null)
            {
                long modDate = DefaultTypeConverter.INSTANCE.longValue(modifiedDate);
                m_response.setHeader(WebDAV.HEADER_LAST_MODIFIED, WebDAV.formatHeaderDate(modDate));
            }

            ContentReader reader = fileFolderService.getReader(realNodeInfo.getNodeRef());
            // ensure that we generate something, even if the content is missing
            reader = FileContentReader.getSafeContentReader(
                    (ContentReader) reader,
                    I18NUtil.getMessage(FileContentReader.MSG_MISSING_CONTENT),
                    realNodeInfo.getNodeRef(), reader);

            if (byteRanges != null && byteRanges.startsWith(RANGE_HEADER_UNIT_SPECIFIER))
            {
                HttpRangeProcessor rangeProcessor = new HttpRangeProcessor(getContentService());
                String userAgent = m_request.getHeader(WebDAV.HEADER_USER_AGENT);

                if (m_returnContent)
                {
                    rangeProcessor.processRange(
                            m_response,
                            reader,
                            byteRanges.substring(6), // strip the leading "bytes=" unit specifier
                            realNodeInfo.getNodeRef(),
                            ContentModel.PROP_CONTENT,
                            reader.getMimetype(),
                            userAgent);
                }
            }
            else
            {
                // there is content associated with the node
                m_response.setHeader(WebDAV.HEADER_CONTENT_LENGTH, Long.toString(reader.getSize()));
                m_response.setHeader(WebDAV.HEADER_CONTENT_TYPE, reader.getMimetype());

                if (m_returnContent)
                {
                    // copy the content to the response output stream
                    reader.getContent(m_response.getOutputStream());
                }
            }
        }
    }

    /**
     * Checks the If header conditions
     *
     * @param nodeInfo the node to check
     * @throws WebDAVServerException if a pre-condition is not met
     */
    private void checkPreConditions(FileInfo nodeInfo) throws WebDAVServerException
    {
        // Make an etag for the node
        String strETag = getDAVHelper().makeQuotedETag(nodeInfo);
        TypeConverter typeConv = DefaultTypeConverter.INSTANCE;

        // Check the If-Match header, don't send any content back if none of the tags in
        // the list match the etag, and the wildcard is not present
        if (ifMatchTags != null)
        {
            if (ifMatchTags.contains(WebDAV.ASTERISK) == false && ifMatchTags.contains(strETag) == false)
            {
                throw new WebDAVServerException(HttpServletResponse.SC_PRECONDITION_FAILED);
            }
        }

        // Check the If-None-Match header, don't send any content back if any of the tags
        // in the list match the etag, or the wildcard is present
        if (ifNoneMatchTags != null)
        {
            if (ifNoneMatchTags.contains(WebDAV.ASTERISK) || ifNoneMatchTags.contains(strETag))
            {
                throw new WebDAVServerException(HttpServletResponse.SC_NOT_MODIFIED);
            }
        }

        // Check the modified since list, if the If-None-Match header was not specified
        if (m_ifModifiedSince != null && ifNoneMatchTags == null)
        {
            Date lastModifiedDate = nodeInfo.getModifiedDate();

            long fileLastModified = lastModifiedDate != null ? typeConv.longValue(lastModifiedDate) : 0L;
            long modifiedSince = m_ifModifiedSince.getTime();

            if (fileLastModified != 0L && fileLastModified <= modifiedSince)
            {
                throw new WebDAVServerException(HttpServletResponse.SC_NOT_MODIFIED);
            }
        }

        // Check the un-modified since list
        if (m_ifUnModifiedSince != null)
        {
            Date lastModifiedDate = nodeInfo.getModifiedDate();

            long fileLastModified = lastModifiedDate != null ? typeConv.longValue(lastModifiedDate) : 0L;
            long unModifiedSince = m_ifUnModifiedSince.getTime();

            if (fileLastModified >= unModifiedSince)
            {
                throw new WebDAVServerException(HttpServletResponse.SC_PRECONDITION_FAILED);
            }
        }
    }

    /**
     * Parses the given ETag header into a list of separate ETags
     *
     * @param strETagHeader The header to parse
     * @return A list of ETags
     */
    private ArrayList<String> parseETags(String strETagHeader)
    {
        ArrayList<String> list = new ArrayList<String>();

        StringTokenizer tokenizer = new StringTokenizer(strETagHeader, WebDAV.HEADER_VALUE_SEPARATOR);
        while (tokenizer.hasMoreTokens())
        {
            list.add(tokenizer.nextToken().trim());
        }

        return list;
    }

    /**
     * Generates an HTML representation of the contents of the path represented by the given node
     *
     * @param fileInfo the file to use
     */
    private void generateDirectoryListing(FileInfo fileInfo)
    {
        FileFolderService fileFolderService = getFileFolderService();
        MimetypeService mimeTypeService = getMimetypeService();

        Writer writer = null;

        try
        {
            writer = m_response.getWriter();

            boolean wasLink = false;
            if (fileInfo.isLink())
            {
                fileInfo = getFileFolderService().getFileInfo(fileInfo.getLinkNodeRef());
                wasLink = true;
            }
            // Get the list of child nodes for the parent node
            List<FileInfo> childNodeInfos = fileFolderService.list(fileInfo.getNodeRef());

            // Send back the start of the HTML
            writer.write("<html><head><title>");
            writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.repository_title")));
            writer.write("</title>");
            writer.write("<style>");
            writer.write("body { font-family: Arial, Helvetica; font-size: 12pt; background-color: white; }\n");
            writer.write("table { font-family: Arial, Helvetica; font-size: 12pt; background-color: white; }\n");
            writer.write(".listingTable { border: solid black 1px; }\n");
            writer.write(".textCommand { font-family: verdana; font-size: 10pt; }\n");
            writer.write(".textLocation { font-family: verdana; font-size: 11pt; font-weight: bold; color: #2a568f; }\n");
            writer.write(".textData { font-family: verdana; font-size: 10pt; }\n");
            writer.write(".tableHeading { font-family: verdana; font-size: 10pt; font-weight: bold; color: white; background-color: #2a568f; }\n");
            writer.write(".rowOdd { background-color: #eeeeee; }\n");
            writer.write(".rowEven { background-color: #dddddd; }\n");
            writer.write("</style></head>\n");
            writer.flush();

            // Send back the table heading
            writer.write("<body>\n");
            writer.write("<table cellspacing='2' cellpadding='3' border='0' width='100%'>\n");
            writer.write("<tr><td colspan='4' class='textLocation'>");
            writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.directory_listing")));
            writer.write(' ');
            writer.write(WebDAVHelper.encodeHTML(getPath()));
            writer.write("</td></tr>\n");
            writer.write("<tr><td height='10' colspan='4'></td></tr></table>");

            writer.write("<table cellspacing='2' cellpadding='3' border='0' width='100%' class='listingTable'>\n");
            writer.write("<tr><td class='tableHeading' width='*'>");
            writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.column.name")));
            writer.write("</td>");
            writer.write("<td class='tableHeading' width='10%'>");
            writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.column.size")));
            writer.write("</td>");
            writer.write("<td class='tableHeading' width='20%'>");
            writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.column.type")));
            writer.write("</td>");
            writer.write("<td class='tableHeading' width='25%'>");
            writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.column.modifieddate")));
            writer.write("</td>");
            writer.write("</tr>\n");

            // Get the URL for the root path
            String rootURL = getURLForPath(m_request, getPath(), true);
            if (rootURL.endsWith(WebDAVHelper.PathSeperator) == false)
            {
                rootURL = rootURL + WebDAVHelper.PathSeperator;
            }
            if (wasLink)
            {
                Path pathToNode = getNodeService().getPath(fileInfo.getNodeRef());
                if (pathToNode.size() > 2)
                {
                    pathToNode = pathToNode.subPath(2, pathToNode.size() - 1);
                }

                rootURL = getURLForPath(m_request, pathToNode.toDisplayPath(getNodeService(), getPermissionService()), true);
                if (rootURL.endsWith(WebDAVHelper.PathSeperator) == false)
                {
                    rootURL = rootURL + WebDAVHelper.PathSeperator;
                }

                rootURL = rootURL + WebDAVHelper.encodeURL(fileInfo.getName(), m_userAgent) + WebDAVHelper.PathSeperator;
            }
            // Start with a link to the parent folder so we can navigate back up, unless we are at the root level
            if (fileInfo.getNodeRef().equals(getRootNodeRef()) == false)
            {
                writer.write("<tr class='rowOdd'>");
                writer.write("<td colspan='4' class='textData'><a href=\"");

                // Strip the last folder from the path
                String parentFolderUrl = parentFolder(rootURL);
                writer.write(parentFolderUrl);

                writer.write("\">");
                writer.write("[");
                writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.column.navigate_up")));
                writer.write("]</a>");
                writer.write("</tr>\n");
            }

            // Send back what we have generated so far
            writer.flush();
            int rowId = 0;

            for (FileInfo childNodeInfo : childNodeInfos)
            {
                // Output the details for the current node
                writer.write("<tr class='");
                if ((rowId++ & 1) == 1)
                {
                    writer.write("rowOdd");
                }
                else
                {
                    writer.write("rowEven");
                }
                writer.write("'><td class='textData'><a href=\"");
                writer.write(rootURL);

                // name field
                String fname = childNodeInfo.getName();

                writer.write(WebDAVHelper.encodeURL(fname, m_userAgent));
                writer.write("\">");
                writer.write(WebDAVHelper.encodeHTML(fname));
                writer.write("</a>");

                // size field
                writer.write("</td><td class='textData'>");
                if (childNodeInfo.isFolder())
                {
                    writer.write(" ");
                }
                else
                {
                    ContentReader reader = fileFolderService.getReader(childNodeInfo.getNodeRef());
                    long fsize = 0L;
                    if (reader != null)
                    {
                        fsize = reader.getSize();
                    }
                    writer.write(formatSize(Long.toString(fsize)));
                }
                writer.write("</td><td class='textData'>");

                // mimetype field
                if (childNodeInfo.isFolder())
                {
                    writer.write(" ");
                }
                else
                {
                    ContentReader reader = fileFolderService.getReader(childNodeInfo.getNodeRef());
                    String mimetype = " ";
                    if (reader != null)
                    {
                        mimetype = reader.getMimetype();
                        String displayType = mimeTypeService.getDisplaysByMimetype().get(reader.getMimetype());

                        if (displayType != null)
                        {
                            mimetype = displayType;
                        }
                        if (mimetype == null)
                        {
                            mimetype = reader.getMimetype();
                        }
                    }
                    writer.write(mimetype);
                }
                writer.write("</td><td class='textData'>");

                // modified date field
                Date modifiedDate = childNodeInfo.getModifiedDate();
                if (modifiedDate != null)
                {
                    writer.write(WebDAV.formatHeaderDate(DefaultTypeConverter.INSTANCE.longValue(modifiedDate)));
                }
                else
                {
                    writer.write(" ");
                }
                writer.write("</td></tr>\n");

                // flush every few rows
                if ((rowId & 15) == 0)
                {
                    writer.flush();
                }
            }

            writer.write("</table></body></html>");
        }
        catch (Throwable e)
        {
            logger.error(e);

            if (writer != null)
            {
                try
                {
                    writer.write("</table><table><tr><td style='color:red'>");
                    writer.write(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.err.dir")));
                    writer.write("</td></tr></table></body></html>");
                    writer.flush();
                }
                catch (IOException ioe)
                {
                    // The error page could not be sent to the client; nothing more can be done here
                }
            }
        }
    }

    /**
     * Given a path, will return the parent path. For example: /a/b/c
     * will return /a/b and /a/b will return /a.
     *
     * @param path The path to return the parent of - must be non-null.
     * @return String - parent path.
     */
    private String parentFolder(String path)
    {
        if (path.endsWith(WebDAVHelper.PathSeperator))
        {
            // Strip trailing slash.
            path = path.substring(0, path.length() - 1);
        }
        String[] paths = getDAVHelper().splitPath(path);
        String parent = paths[0];
        if (parent.equals(""))
        {
            parent = WebDAVHelper.PathSeperator;
        }
        return parent;
    }

    /**
     * Formats the given size for display in a directory listing
     *
     * @param strSize The content size
     * @return The formatted size
     */
    private String formatSize(String strSize)
    {
        String strFormattedSize = strSize;

        int length = strSize.length();
        if (length < 4)
        {
            // Fewer than four digits: display the raw byte count
            strFormattedSize = strSize + ' ' + WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.size.bytes"));
        }
        else if (length >= 4 && length < 7)
        {
            // Four to six digits: kilobytes, keeping one digit after the decimal point
            String strLeft = strSize.substring(0, length - 3);
            String strRight = strSize.substring(length - 3, length - 2);

            StringBuilder buffer = new StringBuilder(strLeft);
            if (!strRight.equals("0"))
            {
                buffer.append('.');
                buffer.append(strRight);
            }
            buffer.append(' ').append(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.size.kilobytes")));

            strFormattedSize = buffer.toString();
        }
        else
        {
            // Seven digits or more: megabytes, keeping one digit after the decimal point
            String strLeft = strSize.substring(0, length - 6);
            String strRight = strSize.substring(length - 6, length - 5);

            StringBuilder buffer = new StringBuilder(strLeft);
            if (!strRight.equals("0"))
            {
                buffer.append('.');
                buffer.append(strRight);
            }
            buffer.append(' ').append(WebDAVHelper.encodeHTML(I18NUtil.getMessage("webdav.size.megabytes")));

            strFormattedSize = buffer.toString();
        }

        return strFormattedSize;
    }
}
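
A rough client-side sketch (not part of this class) of exercising the headers GetMethod handles: it issues a ranged GET against an assumed local Alfresco WebDAV endpoint with hypothetical credentials and document path, using the Java 11+ java.net.http client. When the range is honoured the response is typically 206 Partial Content; a matching If-None-Match tag would instead yield 304 via checkPreConditions().

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class WebDavGetClientSketch
{
    public static void main(String[] args) throws Exception
    {
        // Assumed local server, credentials and document path - adjust for a real repository
        String url = "http://localhost:8080/alfresco/webdav/Sites/example/documentLibrary/report.docx";
        String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes(StandardCharsets.UTF_8));

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + auth)
                .header("Range", "bytes=0-1023") // captured by parseRequestHeaders(), served via HttpRangeProcessor
                .GET()
                .build();

        HttpResponse<byte[]> response = client.send(request, HttpResponse.BodyHandlers.ofByteArray());
        System.out.println("Status: " + response.statusCode());
        System.out.println("ETag: " + response.headers().firstValue("ETag").orElse("<none>"));
        System.out.println("Bytes received: " + response.body().length);
    }
}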