Merge pull request IQSS#10172 from ErykKul/10116_incomplete_matadata_label_setting

10116 incomplete matadata label setting
sekmiller authored May 14, 2024
2 parents ae78c04 + c40838c commit da3dd95
Showing 12 changed files with 104 additions and 32 deletions.
@@ -0,0 +1 @@
Fixed a bug that caused the ``incomplete metadata`` label to be shown for published datasets with incomplete metadata in certain scenarios. The label is now shown for draft versions of such datasets and for published datasets that the user can edit. It can also be hidden for published datasets (regardless of edit rights) by setting the new option ``dataverse.ui.show-validity-label-when-published`` to ``false``.
20 changes: 20 additions & 0 deletions doc/sphinx-guides/source/installation/config.rst
@@ -2945,6 +2945,24 @@ Defaults to ``false``.
Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_API_ALLOW_INCOMPLETE_METADATA``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

.. _dataverse.ui.show-validity-label-when-published:

dataverse.ui.show-validity-label-when-published
+++++++++++++++++++++++++++++++++++++++++++++++

Even when you do not allow incomplete metadata to be saved in your Dataverse installation, some metadata may end up incomplete, e.g., after a metadata field is made mandatory. Datasets where that field is
not filled out become incomplete and can therefore be labeled with the ``incomplete metadata`` label. By default, this label is shown for draft datasets and for published datasets that the
user can edit. If you set this option to ``false``, only draft datasets with incomplete metadata carry the label; published datasets never do, regardless of edit rights.
Note that you need to reindex the datasets after changing the metadata definitions; reindexing updates the labels and other dataset information to match the new definitions.

When enabled (the default), published datasets with incomplete metadata carry the ``incomplete metadata`` label, but only for users who can edit them.
You can list these datasets, for example, with the metadata validity filter on the "My Data" page, which can be turned on by enabling the :ref:`dataverse.ui.show-validity-filter` option.

Defaults to ``true``.

Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.
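
The rule above boils down to a small decision function. The following is an illustrative sketch only (the class, method, and parameter names are hypothetical; the real checks live in DatasetPage, FilePage, and SolrSearchResult):

public final class ValidityLabelRule {

    /**
     * @param isDraft                whether the version being viewed is a draft
     * @param metadataValid          whether the version passes metadata validation
     * @param userCanEdit            whether the current user can edit the dataset
     * @param showLabelWhenPublished value of dataverse.ui.show-validity-label-when-published (defaults to true)
     * @return true if the "incomplete metadata" label should be rendered
     */
    public static boolean showIncompleteLabel(boolean isDraft, boolean metadataValid,
                                              boolean userCanEdit, boolean showLabelWhenPublished) {
        if (metadataValid) {
            return false;                      // complete metadata never gets the label
        }
        if (isDraft) {
            return true;                       // drafts with incomplete metadata are always labeled
        }
        // published versions: label only when this option is enabled and the user has edit rights
        return showLabelWhenPublished && userCanEdit;
    }
}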

.. _dataverse.signposting.level1-author-limit:

dataverse.signposting.level1-author-limit
@@ -3142,6 +3160,8 @@ Defaults to ``false``.
Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_UI_ALLOW_REVIEW_FOR_INCOMPLETE``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

.. _dataverse.ui.show-validity-filter:

dataverse.ui.show-validity-filter
+++++++++++++++++++++++++++++++++

8 changes: 3 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -2296,13 +2296,11 @@ private void displayPublishMessage(){

public boolean isValid() {
if (valid == null) {
DatasetVersion version = dataset.getLatestVersion();
if (!version.isDraft()) {
if (workingVersion.isDraft() || (canUpdateDataset() && JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true))) {
valid = workingVersion.isValid();
} else {
valid = true;
}
DatasetVersion newVersion = version.cloneDatasetVersion();
newVersion.setDatasetFields(newVersion.initDatasetFields());
valid = newVersion.isValid();
}
return valid;
}
31 changes: 30 additions & 1 deletion src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
@@ -1728,7 +1728,36 @@ public List<ConstraintViolation<DatasetField>> validateRequired() {
}

public boolean isValid() {
return validate().isEmpty();
// first clone to leave the original untouched
final DatasetVersion newVersion = this.cloneDatasetVersion();
// initDatasetFields
newVersion.setDatasetFields(newVersion.initDatasetFields());
// remove special "N/A" values and empty values
newVersion.removeEmptyValues();
// check validity of present fields and detect missing mandatory fields
return newVersion.validate().isEmpty();
}

private void removeEmptyValues() {
if (this.getDatasetFields() != null) {
for (DatasetField dsf : this.getDatasetFields()) {
removeEmptyValues(dsf);
}
}
}

private void removeEmptyValues(DatasetField dsf) {
if (dsf.getDatasetFieldType().isPrimitive()) { // primitive
final Iterator<DatasetFieldValue> i = dsf.getDatasetFieldValues().iterator();
while (i.hasNext()) {
final String v = i.next().getValue();
if (StringUtils.isBlank(v) || DatasetField.NA_VALUE.equals(v)) {
i.remove();
}
}
} else {
dsf.getDatasetFieldCompoundValues().forEach(cv -> cv.getChildDatasetFields().forEach(v -> removeEmptyValues(v)));
}
}

public Set<ConstraintViolation> validate() {
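A usage sketch, assuming the Dataverse domain classes above are on the classpath (the helper name is hypothetical and not part of this change): isValid() validates a clone, so checking validity never mutates the version being checked, and blank or "N/A" values in required fields count as missing.

    // hypothetical helper, for illustration
    static boolean hasCompleteMetadata(DatasetVersion version) {
        // isValid() clones the version, re-initializes its fields, strips blank and
        // DatasetField.NA_VALUE entries from the clone, and runs validation on that clone
        final boolean complete = version.isValid();
        // the original 'version' still contains any blank or "N/A" values afterwards
        return complete;
    }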
16 changes: 11 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/FilePage.java
@@ -34,6 +34,7 @@
import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean;
import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean.MakeDataCountEntry;
import edu.harvard.iq.dataverse.privateurl.PrivateUrlServiceBean;
import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.BundleUtil;
import edu.harvard.iq.dataverse.util.FileUtil;
@@ -314,13 +315,18 @@ private void displayPublishMessage(){
}
}

Boolean valid = null;

public boolean isValid() {
if (!fileMetadata.getDatasetVersion().isDraft()) {
return true;
if (valid == null) {
final DatasetVersion workingVersion = fileMetadata.getDatasetVersion();
if (workingVersion.isDraft() || (canUpdateDataset() && JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true))) {
valid = workingVersion.isValid();
} else {
valid = true;
}
}
DatasetVersion newVersion = fileMetadata.getDatasetVersion().cloneDatasetVersion();
newVersion.setDatasetFields(newVersion.initDatasetFields());
return newVersion.isValid();
return valid;
}

private boolean canViewUnpublishedDataset() {
Expand Up @@ -3,6 +3,7 @@
*/
package edu.harvard.iq.dataverse.mydata;

import edu.harvard.iq.dataverse.DatasetServiceBean;
import edu.harvard.iq.dataverse.DataverseRoleServiceBean;
import edu.harvard.iq.dataverse.DataverseServiceBean;
import edu.harvard.iq.dataverse.DataverseSession;
@@ -63,7 +64,7 @@ public class DataRetrieverAPI extends AbstractApiBean {
private static final String retrieveDataPartialAPIPath = "retrieve";

@Inject
DataverseSession session;
DataverseSession session;

@EJB
DataverseRoleServiceBean dataverseRoleService;
@@ -81,6 +82,8 @@ public class DataRetrieverAPI extends AbstractApiBean {
//MyDataQueryHelperServiceBean myDataQueryHelperServiceBean;
@EJB
GroupServiceBean groupService;
@EJB
DatasetServiceBean datasetService;

private List<DataverseRole> roleList;
private DataverseRolePermissionHelper rolePermissionHelper;
@@ -491,7 +494,8 @@ private JsonArrayBuilder formatSolrDocs(SolrQueryResponse solrResponse, RoleTagR
// -------------------------------------------
// (a) Get core card data from solr
// -------------------------------------------
myDataCardInfo = doc.getJsonForMyData();

myDataCardInfo = doc.getJsonForMyData(isValid(doc));

if (doc.getEntity() != null && !doc.getEntity().isInstanceofDataFile()){
String parentAlias = dataverseService.getParentAliasString(doc);
@@ -514,4 +518,8 @@ private JsonArrayBuilder formatSolrDocs(SolrQueryResponse solrResponse, RoleTagR
return jsonSolrDocsArrayBuilder;

}

private boolean isValid(SolrSearchResult result) {
return result.isValid(x -> true);
}
}
@@ -292,7 +292,7 @@ public String getSolrFragmentForPublicationStatus(){
}

public String getSolrFragmentForDatasetValidity(){
if ((this.datasetValidities == null) || (this.datasetValidities.isEmpty())){
if ((this.datasetValidities == null) || (this.datasetValidities.isEmpty()) || (this.datasetValidities.size() > 1)){
return "";
}

@@ -835,16 +835,7 @@ public SolrInputDocuments toSolrDocs(IndexableDataset indexableDataset, Set<Long
solrInputDocument.addField(SearchFields.DATASET_PERSISTENT_ID, dataset.getGlobalId().toString());
solrInputDocument.addField(SearchFields.PERSISTENT_URL, dataset.getPersistentURL());
solrInputDocument.addField(SearchFields.TYPE, "datasets");

boolean valid;
if (!indexableDataset.getDatasetVersion().isDraft()) {
valid = true;
} else {
DatasetVersion version = indexableDataset.getDatasetVersion().cloneDatasetVersion();
version.setDatasetFields(version.initDatasetFields());
valid = version.isValid();
}
solrInputDocument.addField(SearchFields.DATASET_VALID, valid);
solrInputDocument.addField(SearchFields.DATASET_VALID, indexableDataset.getDatasetVersion().isValid());

final Dataverse dataverse = dataset.getDataverseContext();
final String dvIndexableCategoryName = dataverse.getIndexableCategoryName();
@@ -355,8 +355,7 @@ The real issue here (https://github.com/IQSS/dataverse/issues/7304) is caused
* https://github.com/IQSS/dataverse/issues/84
*/
int numRows = 10;
HttpServletRequest httpServletRequest = (HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest();
DataverseRequest dataverseRequest = new DataverseRequest(session.getUser(), httpServletRequest);
DataverseRequest dataverseRequest = getDataverseRequest();
List<Dataverse> dataverses = new ArrayList<>();
dataverses.add(dataverse);
solrQueryResponse = searchService.search(dataverseRequest, dataverses, queryToPassToSolr, filterQueriesFinal, sortField, sortOrder.toString(), paginationStart, onlyDataRelatedToMe, numRows, false, null, null, !isFacetsDisabled(), true);
@@ -1489,9 +1488,14 @@ public boolean isRetentionExpired(SolrSearchResult result) {
return false;
}
}

private DataverseRequest getDataverseRequest() {
final HttpServletRequest httpServletRequest = (HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest();
return new DataverseRequest(session.getUser(), httpServletRequest);
}

public boolean isValid(SolrSearchResult result) {
return result.isValid();
return result.isValid(x -> permissionsWrapper.canUpdateDataset(getDataverseRequest(), datasetService.find(x.getEntityId())));
}

public enum SortOrder {
@@ -7,6 +7,7 @@
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;
import java.util.logging.Logger;

import edu.harvard.iq.dataverse.*;
@@ -19,6 +20,7 @@

import edu.harvard.iq.dataverse.api.Util;
import edu.harvard.iq.dataverse.dataset.DatasetThumbnail;
import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.util.DateUtil;
import edu.harvard.iq.dataverse.util.json.JsonPrinter;
import edu.harvard.iq.dataverse.util.json.NullSafeJsonBuilder;
@@ -402,15 +404,15 @@ public JsonArrayBuilder getRelevance() {
*
* @return
*/
public JsonObjectBuilder getJsonForMyData() {
public JsonObjectBuilder getJsonForMyData(boolean isValid) {

JsonObjectBuilder myDataJson = json(true, true, true);// boolean showRelevance, boolean showEntityIds, boolean showApiUrls)

myDataJson.add("publication_statuses", this.getPublicationStatusesAsJSON())
.add("is_draft_state", this.isDraftState()).add("is_in_review_state", this.isInReviewState())
.add("is_unpublished_state", this.isUnpublishedState()).add("is_published", this.isPublishedState())
.add("is_deaccesioned", this.isDeaccessionedState())
.add("is_valid", this.isValid())
.add("is_valid", isValid)
.add("date_to_display_on_card", getDateToDisplayOnCard());

// Add is_deaccessioned attribute, even though MyData currently screens any deaccessioned info out
@@ -1256,7 +1258,19 @@ public void setDatasetValid(Boolean datasetValid) {
this.datasetValid = datasetValid == null || Boolean.valueOf(datasetValid);
}

public boolean isValid() {
return datasetValid;
public boolean isValid(Predicate<SolrSearchResult> canUpdateDataset) {
if (this.datasetValid) {
return true;
}
if (!this.getType().equals("datasets")) {
return true;
}
if (this.isDraftState()) {
return false;
}
if (!JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true)) {
return true;
}
return !canUpdateDataset.test(this);
}
}
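
The predicate keeps SolrSearchResult free of any permission or request plumbing: for an invalid published dataset, the caller decides whether the current user counts as an editor. A sketch of the two call sites in this pull request, assuming doc, result, permissionsWrapper, datasetService, and getDataverseRequest() are in scope as in the diffs above:

    // My Data API: every hit is treated as editable, so invalid published datasets keep the label
    boolean myDataValid = doc.isValid(x -> true);

    // Search page: the label is suppressed unless the user can actually edit the dataset
    boolean searchValid = result.isValid(
            x -> permissionsWrapper.canUpdateDataset(getDataverseRequest(),
                                                     datasetService.find(x.getEntityId())));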
@@ -230,6 +230,7 @@ public enum JvmSettings {
SCOPE_UI(PREFIX, "ui"),
UI_ALLOW_REVIEW_INCOMPLETE(SCOPE_UI, "allow-review-for-incomplete"),
UI_SHOW_VALIDITY_FILTER(SCOPE_UI, "show-validity-filter"),
UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED(SCOPE_UI, "show-validity-label-when-published"),

// NetCDF SETTINGS
SCOPE_NETCDF(PREFIX, "netcdf"),
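Elsewhere in this pull request (DatasetPage, FilePage, SolrSearchResult) the new entry is read with an Optional lookup that falls back to true when the setting is absent:

    boolean showWhenPublished = JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED
            .lookupOptional(Boolean.class)
            .orElse(true);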
2 changes: 1 addition & 1 deletion src/main/webapp/file.xhtml
@@ -77,7 +77,7 @@
<h:outputText value="#{bundle['dataset.versionUI.unpublished']}" styleClass="label label-warning" rendered="#{!FilePage.fileMetadata.datasetVersion.dataset.released}"/>
<h:outputText value="#{bundle['dataset.versionUI.deaccessioned']}" styleClass="label label-danger" rendered="#{FilePage.fileMetadata.datasetVersion.deaccessioned}"/>
<h:outputText value="#{FilePage.fileMetadata.datasetVersion.externalStatusLabel}" styleClass="label label-info" rendered="#{FilePage.fileMetadata.datasetVersion.externalStatusLabel!=null and FilePage.canPublishDataset()}"/>
<h:outputText value="#{bundle['incomplete']}" styleClass="label label-danger" rendered="#{FilePage.fileMetadata.datasetVersion.draft and !FilePage.fileMetadata.datasetVersion.valid}"/>
<h:outputText value="#{bundle['incomplete']}" styleClass="label label-danger" rendered="#{!FilePage.valid}"/>
<!-- DATASET VERSION NUMBER -->
<h:outputText styleClass="label label-default" rendered="#{FilePage.fileMetadata.datasetVersion.released and !(FilePage.fileMetadata.datasetVersion.draft or FilePage.fileMetadata.datasetVersion.inReview)}"
value="#{bundle['file.DatasetVersion']} #{FilePage.fileMetadata.datasetVersion.versionNumber}.#{FilePage.fileMetadata.datasetVersion.minorVersionNumber}"/>
